Leveraging process data to assess adults' problem-solving skills: Using sequence mining to identify behavioral patterns across digital tasks
This paper illustrates how process data can be used to identify behavioral patterns in a computer-based problem-solving assessment. Using sequence-mining techniques, we identify patterns of behavior across multiple digital tasks from the sequences of actions undertaken by respondents. We then examine how respondents' action sequences (which we label "strategies") differ from optimal strategies. In our application, optimality is defined ex ante as the sequence of actions that the content experts involved in the development of the assessment tasks identified as the most efficient to solve the task, given the range of possible actions available to test-takers. Data on 7,462 respondents from five countries (the United Kingdom, Ireland, Japan, the Netherlands, and the United States) participating in the Problem Solving in Technology-Rich Environments (PSTRE) assessment, administered as part of the OECD Programme for the International Assessment of Adult Competencies (PIAAC), indicate that valuable insights can be derived from the analysis of process data. Adults who follow optimal strategies are more likely to obtain high scores in the PSTRE assessment, while low performers consistently adopt strategies that are very distant from the optimal ones. Very few high performers solve the items in a fully efficient way, i.e. by minimizing the number of actions and by avoiding unnecessary or redundant actions. Women and adults above the age of 40 are more likely to adopt sub-optimal problem-solving strategies.
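As a concrete illustration of comparing logged behavior to an expert-defined optimum, the distance between a respondent's action sequence and the optimal one can be measured with a Levenshtein edit distance. This is only a minimal sketch — the function and action names below are hypothetical, not taken from the PIAAC assessment platform:

```python
def edit_distance(a, b):
    """Levenshtein distance between two action sequences (lists of tokens)."""
    m, n = len(a), len(b)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        dp[i][0] = i          # delete all of a's prefix
    for j in range(n + 1):
        dp[0][j] = j          # insert all of b's prefix
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,        # deletion
                           dp[i][j - 1] + 1,        # insertion
                           dp[i - 1][j - 1] + cost)  # substitution
    return dp[m][n]

# Hypothetical optimal strategy defined ex ante by content experts
optimal = ["open_email", "sort_by_sender", "flag_item", "submit"]
# A respondent's logged sequence containing two redundant actions
observed = ["open_email", "open_help", "sort_by_sender", "sort_by_date",
            "flag_item", "submit"]
print(edit_distance(observed, optimal))  # 2: two extra actions to remove
```

A distance of zero means the respondent followed the optimal strategy exactly; larger distances capture both missing and redundant actions, which matches the paper's notion of strategies "distant from optimal ones".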
The evolution of gender gaps in numeracy and literacy between childhood and young adulthood
Numeracy and literacy are important foundation skills which command significant wage premia in modern labour markets. The existence of gender differences in these skills is therefore of potential concern and has spurred a large amount of research, especially with respect to numeracy. Still, little is known about when such gaps emerge, how they evolve, and whether this evolution differs across countries. We use data from large-scale international assessments to follow representative samples of birth cohorts over time, and analyse how gender gaps in numeracy and literacy evolve from age 10 to age 27. We find that the advantage of boys in numeracy is small at age 10 but grows considerably between ages 15 and 27. The gender gap in literacy follows a very different pattern: it is small at age 10, large and in favour of girls at age 15, and negligible by age 27.
The use of meteorological analogues to account for LAM QPF uncertainty
Flood predictions based on quantitative precipitation forecasts (QPFs) provided by deterministic models do not account for the uncertainty in the outcomes. A probabilistic approach to QPF, one which accounts for the variability of the phenomena and the uncertainty associated with a hydrological forecast, seems indispensable to obtain the range of future flow scenarios needed to manage a flood. A new approach based on a search for analogues, that is, past situations similar to the current one in terms of different meteorological fields over Western Europe and the East Atlantic, has been developed to determine an ensemble of hourly quantitative precipitation forecasts for the Reno river basin, a medium-sized catchment in northern Italy. A statistical analysis, performed over a hydro-meteorological archive of ECMWF analyses at 12:00 UTC for the autumn seasons from 1990 to 2000 and the corresponding precipitation measurements recorded by the raingauges spread over the catchment of interest, showed that the combination of geopotential at 500 hPa and vertical velocity at 700 hPa provides a better estimation of precipitation. The analogue-based ensemble prediction should be considered not as an alternative but as complementary to the deterministic QPF provided by a numerical model, especially in view of a joint employment to improve real-time flood forecasting. In the present study, the analogue-based QPFs and the precipitation forecast provided by the Limited Area Model LAMBO have been used as different inputs to the distributed rainfall-runoff model TOPKAPI, thus generating, respectively, an ensemble of discharge forecasts, which provides a confidence interval for the predicted streamflow, and a deterministic discharge forecast taken as an error-affected "measurement" of the future flow, which does not convey any quantification of the forecast uncertainty. To make the hydrological prediction more informative, the ensemble spread can be regarded as a measure of the uncertainty of the deterministic forecast.
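The analogue search described above can be sketched in a few lines: archived days are ranked by their root-mean-square distance to the current meteorological fields, and the closest ones form the analogue set. This is a hedged illustration only — the array layout, sizes, and data below are toy assumptions, not the authors' actual ECMWF archive:

```python
import numpy as np

def find_analogues(current, archive, k=5):
    """Return the indices of the k archived days closest to the current
    situation, by RMS distance over the chosen meteorological fields.

    current : shape (n_features,)  - e.g. flattened Z500 and w700 fields
    archive : shape (n_days, n_features)
    """
    dists = np.sqrt(((archive - current) ** 2).mean(axis=1))
    return np.argsort(dists)[:k]

# Toy archive: 100 past days, 50 grid-point values per day (illustrative)
rng = np.random.default_rng(0)
archive = rng.normal(size=(100, 50))
current = archive[42] + rng.normal(scale=0.01, size=50)  # near-copy of day 42

idx = find_analogues(current, archive, k=5)
# The precipitation observed on the analogue days then forms the
# ensemble of hourly QPFs fed to the rainfall-runoff model.
```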
Performance of the ARPA-SMR limited-area ensemble prediction system: two flood cases
The performance of the ARPA-SMR Limited-area Ensemble Prediction System (LEPS), generated by nesting a limited-area model on selected members of the ECMWF targeted ensemble, is evaluated for two flood events that occurred during September 1992. The predictability of the events is studied for forecast times ranging from 2 to 4 days, investigating the extent to which floods localised in time and space can be forecast at high resolution in probabilistic terms. Rainfall probability maps generated by both the LEPS and the ECMWF targeted ensembles are compared for different precipitation thresholds in order to assess the impact of the enhanced resolution. At all considered forecast ranges, LEPS performs better, providing a more accurate description of the event with respect to its spatio-temporal location as well as its intensity. In both flood cases, LEPS probability maps turn out to be a very valuable tool to assist forecasters in issuing flood alerts at different forecast ranges. It is also shown that, at the shortest forecast range, the deterministic prediction provided by the limited-area model, when run in a higher-resolution configuration, yields a very accurate rainfall pattern and a good quantitative estimate of the total rainfall in the flooded regions.
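Rainfall probability maps of the kind compared above can be derived by counting, at each grid point, the ensemble members whose forecast exceeds a given threshold. A minimal sketch with illustrative data (the grid, member count, and values are toy assumptions):

```python
import numpy as np

def exceedance_probability(ensemble, threshold):
    """Fraction of ensemble members forecasting rainfall above a threshold
    at each grid point.

    ensemble  : shape (n_members, ny, nx), accumulated rainfall in mm
    threshold : rainfall amount in mm
    """
    return (ensemble > threshold).mean(axis=0)

# Toy 5-member ensemble on a 2x2 grid (mm of accumulated rainfall)
ens = np.array([[[10., 60.], [0., 80.]],
                [[20., 55.], [5., 90.]],
                [[ 0., 40.], [0., 70.]],
                [[15., 65.], [0., 85.]],
                [[ 5., 30.], [0., 75.]]])
prob = exceedance_probability(ens, 50.0)
# prob[0, 1] = 0.6: three of the five members exceed 50 mm at that point;
# prob[1, 1] = 1.0: all members agree, a strong probabilistic signal.
```

Maps like these, computed for several thresholds, are what allow a flood alert to be issued in probabilistic rather than deterministic terms.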
Lâimportanza della fase interpretativa nelle ricerche di neuromarketing per innovare il marketing digitale e strategico
Il neuromarketing e i suoi strumenti si stanno evolvendo rapidamente e molti studiosi chiedono di perfezionarle con unâinterpretazione dei dati piĂš mirata a dare valore al business e migliori risposte al marketing strategico. Lâobiettivo di questo articolo è di comprendere se il ruolo di una fase interpretativa dei risultati adeguatamente condotta possa contribuire a colmare questo gap. Per raggiungere questo obiettivo è stato necessario sviluppare un progetto di ricerca basato su un mixed methods design, e comprensivo di esperimento neuroscientifico (A/B test, con utilizzo di eye tracker fisso, e analisi tramite fixation, heat e opacity maps), behavioural test (osservazione e questionari comportamentali post stimolo), questionario finale, fase interpretativa di follow up. Il progetto è stato condotto da un team di ricerca misto, composto da due ricercatori universitari, un partner tecnico esperto di neuromarketing, e un manager dellâimpresa in esame. Il caso scelto è stato applicato alla comunicazione digitale (landing page) collegata al lancio di un prodotto innovativo, reale e nuovo sia nel concetto che nel nome del brand. Gli undici soggetti partecipanti sono potenziali acquirenti volontari, sollecitati da alcuni incentivi. Lo studio è stato approfondito, producendo numerosi risultati in tutte le fasi della ricerca che sono stati utili per migliorare il processo di codifica della comunicazione, cosĂŹ da ottimizzare la digital experience della landing page. Ă stata però lâultima fase di discussione dei risultati da parte di diversi esperti che ha apportato maggior valore per innescare un vero cambiamento nel processo decisionale di marketing dellâazienda, confermando lâimportanza della fase interpretativa nelle ricerche di neuromarketi
The Soverato flood in Southern Italy: performance of global and limited-area ensemble forecasts
The predictability of the flood event affecting Soverato (Southern Italy) in September 2000 is investigated by considering three different configurations of the ECMWF ensemble: the operational Ensemble Prediction System (EPS), the targeted EPS and a high-resolution version of EPS. For each configuration, three successive runs of the ECMWF ensemble with the same verification time are grouped together so as to generate a highly populated "super-ensemble". Then, five members are selected from the super-ensemble and used to provide initial and boundary conditions for the integrations with a limited-area model, whose runs generate a Limited-area Ensemble Prediction System (LEPS). The relative impact of targeting the initial perturbations against increasing the horizontal resolution is assessed for the global ensembles as well as for the properties transferred to the LEPS integrations, the attention being focussed on the probabilistic prediction of rainfall over a localised area. At the 108-, 84- and 60-hour forecast ranges, the overall performance of the global ensembles is not particularly accurate, and the best results are obtained by the high-resolution version of EPS. The LEPS performance is very satisfactory in all configurations, and the rainfall maps show probability peaks in the correct regions. LEPS products would have been of great assistance in issuing flood risk alerts on the basis of limited-area ensemble forecasts. For the 60-hour forecast range, the sensitivity of the results to the LEPS ensemble size is discussed by comparing a 5-member against a 51-member LEPS, where the limited-area model is nested on all EPS members. Little sensitivity is found as concerns the detection of the regions most likely affected by heavy precipitation, the probability peaks being approximately the same in both configurations.
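One common way to select a handful of members from a highly populated super-ensemble is to cluster the forecast fields and keep, for each cluster, the member closest to its centroid; the paper's exact selection criterion may differ, so the following is only a hedged sketch of that general technique, with toy data:

```python
import numpy as np

def select_representatives(members, k=5, n_iter=20, seed=0):
    """Cluster ensemble members (flattened forecast fields) with a basic
    k-means and return one representative member index per cluster: the
    member closest to each cluster centroid."""
    rng = np.random.default_rng(seed)
    centroids = members[rng.choice(len(members), size=k, replace=False)]
    for _ in range(n_iter):
        # distance of every member to every centroid, shape (n_members, k)
        d = np.linalg.norm(members[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        for c in range(k):
            if np.any(labels == c):
                centroids[c] = members[labels == c].mean(axis=0)
    d = np.linalg.norm(members[:, None, :] - centroids[None, :, :], axis=2)
    labels = d.argmin(axis=1)
    reps = []
    for c in range(k):
        in_c = np.where(labels == c)[0]
        if in_c.size:
            reps.append(int(in_c[d[in_c, c].argmin()]))
    return reps

# Toy "super-ensemble": 51 members, 30 grid values each (illustrative)
rng = np.random.default_rng(1)
members = rng.normal(size=(51, 30))
reps = select_representatives(members, k=5)
# reps holds up to 5 member indices; these members would then supply the
# initial and boundary conditions for the limited-area integrations.
```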
Microturbulence studies in RFX-mod
Present-day Reversed Field Pinches (RFPs) are characterized by quasi-laminar magnetic configurations in their core, whose boundaries feature sharp internal transport barriers, in analogy with tokamaks and stellarators. The abatement of magnetic chaos leads to the reduction of the associated particle and heat transport along wandering field lines. At the same time, the growth of steep temperature gradients may trigger drift microinstabilities. In this work we summarize the work recently done in the RFP RFX-mod in order to assess the existence, and the impact upon transport, of such electrostatic and electromagnetic microinstabilities as Ion Temperature Gradient (ITG) modes, Trapped Electron Modes (TEM) and microtearing modes. Comment: Work presented at the 2010 Varenna workshop "Theory of Fusion Plasmas". To appear in Journal of Physics Conference Series.
Downscaling With an Unstructured Coastal-Ocean Model to the Goro Lagoon and the Po River Delta Branches
The Goro Lagoon Finite Element Model (GOLFEM) presented in this paper is a high-resolution downscaled model of the Goro Lagoon, of five Po river branches, and of the coastal area of the Po delta in the northern Adriatic Sea (Italy), where crucial socio-economic activities take place. GOLFEM was validated by means of validation scores (bias, BIAS; root mean square error, RMSE; and mean absolute error, MAE) for the water level, current velocity, salinity and temperature measured at several fixed stations in the lagoon. The ranges of the scores at the stations are: for temperature, from −0.8 to +1.2°C; for salinity, from −0.2 to 5 PSU; for sea level, around 0.1 m. The lagoon is dominated by an estuarine vertical circulation, due to a double opening at the lagoon mouth and sustained by multiple sources of freshwater input. The non-linear interactions among the tidal forcing, the wind and the freshwater inputs affect the lagoon circulation at both seasonal and daily time scales. The sensitivity of the circulation to the forcings was analyzed with several experiments excluding the tidal forcing and using different configurations of the river connections. GOLFEM was designed to resolve the lagoon dynamics at high resolution in order to evaluate the potential effects on clam farming of two proposed scenarios of human intervention on the morphology of the connection with the sea. We calculated the changes in the lagoon current speed and salinity and, using suitable fitness indices related to clam physiology, quantified the effects of the interventions in terms of the extension and persistence of the areas of optimal clam growth. The results demonstrate that the correct management of this kind of fragile environment relies on both long-term (intervention scenarios) and short-term (coastal flooding forecasts and potential anoxic conditions) modeling, based on a flexible tool that is able to account for all the recorded human interventions on the river connections. This study also demonstrates the importance of designing a seamless chain of models capable of integrating local effects into the coarser operational oceanographic models.
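The validation scores used above (BIAS, RMSE, MAE) can be computed in a few lines; the station temperatures below are hypothetical, chosen only to illustrate the three definitions:

```python
import numpy as np

def validation_scores(model, obs):
    """BIAS (mean error), RMSE and MAE between model output and
    station observations."""
    model, obs = np.asarray(model, float), np.asarray(obs, float)
    err = model - obs
    bias = err.mean()                  # signed: >0 means model too warm/high
    rmse = np.sqrt((err ** 2).mean())  # penalises large errors more
    mae = np.abs(err).mean()           # average magnitude of the error
    return bias, rmse, mae

# Hypothetical hourly temperatures (degC) at one lagoon station
model = [18.2, 19.0, 20.5, 21.1]
obs   = [18.0, 19.4, 20.0, 21.3]
bias, rmse, mae = validation_scores(model, obs)
# bias = 0.025, rmse = 0.35, mae = 0.325
```

Note that BIAS keeps the sign of the error (useful for spotting systematic over- or under-estimation), while RMSE and MAE measure its magnitude.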