Evaluating a framework of theoretical hypotheses for animation learning
This paper presents a set of theoretical hypotheses suggesting various relationships between the didactical setting and learning effects with animations. In particular, we investigated whether individual flow control adequately provides didactical means to reduce the cognitive load imposed by animations. We did not find an effect of individual flow control, probably because this learning condition was embedded in a setting where not enough verbal information was offered together with the graphical animation. Overall, the multimedia effects found in this study are in line with known principles of didactical multimedia design. Further, this study sheds light on theoretical aspects involved in the complex interaction between learning content, presentation, learning, and resulting knowledge.
Bayesian multi-model projection of climate: bias assumptions and interannual variability
Current climate change projections are based on comprehensive multi-model ensembles of global and regional climate simulations. Application of this information to impact studies requires a combined probabilistic estimate taking into account the different models and their performance under current climatic conditions. Here we present a Bayesian statistical model for the distribution of seasonal mean surface temperatures for control and scenario periods. The model combines observational data for the control period with the output of regional climate models (RCMs) driven by different global climate models (GCMs). The proposed Bayesian methodology addresses seasonal mean temperatures and considers both changes in mean temperature and in interannual variability. In addition, unlike previous studies, our methodology explicitly considers model biases that are allowed to be time-dependent (i.e. change between control and scenario period). More specifically, the model considers additive and multiplicative model biases for each RCM and introduces two plausible assumptions ("constant bias" and "constant relationship") about extrapolating the biases from the control to the scenario period. The resulting identifiability problem is resolved by using informative priors for the bias changes. A sensitivity analysis illustrates the role of the informative prior. As an example, we present results for Alpine winter and summer temperatures for the control (1961-1990) and scenario (2071-2100) periods under the SRES A2 greenhouse gas scenario. For winter, both bias assumptions yield a comparable mean warming of 3.5-3.6°C. For summer, the two assumptions have a strong influence on the probabilistic prediction of mean warming, which amounts to 5.4°C and 3.4°C for the "constant bias" and "constant relationship" assumptions, respectively. Analysis shows that the underlying reason for this large uncertainty is the overestimation of summer interannual variability in all models considered. Our results show the necessity of considering potential bias changes when projecting climate under an emission scenario. Further work is needed to determine how bias information can be exploited for this task.
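The two bias-extrapolation assumptions can be made concrete with a minimal toy sketch. This is not the paper's actual Bayesian formulation; reducing each period to a mean and standard deviation, the function names, and the particular reading of "constant relationship" as a rescaling of the climate-change signal are all illustrative assumptions.

```python
# Toy illustration of the two bias-extrapolation assumptions named in the
# abstract. This is NOT the paper's Bayesian model; it only shows how a
# control-period bias might be carried into the scenario period for one RCM.

def project_constant_bias(obs_ctl_mean, mod_ctl_mean, mod_scn_mean):
    """'Constant bias': the additive bias estimated in the control period
    (model minus observations) is assumed unchanged in the scenario period."""
    additive_bias = mod_ctl_mean - obs_ctl_mean
    return mod_scn_mean - additive_bias


def project_constant_relationship(obs_ctl_mean, obs_ctl_std,
                                  mod_ctl_mean, mod_ctl_std, mod_scn_mean):
    """'Constant relationship' (illustrative reading): the relation between
    observed and modelled interannual variability in the control period is
    assumed to persist, so the model's climate-change signal is rescaled by
    the ratio of observed to modelled standard deviation."""
    scale = obs_ctl_std / mod_ctl_std
    return obs_ctl_mean + scale * (mod_scn_mean - mod_ctl_mean)


if __name__ == "__main__":
    # Made-up numbers (degrees Celsius) for a single model.
    print(project_constant_bias(17.0, 19.0, 24.0))                    # 22.0
    print(project_constant_relationship(17.0, 1.0, 19.0, 1.8, 24.0))  # ~19.8
```

With the same made-up inputs the two assumptions give visibly different scenario temperatures, which mirrors the abstract's point that the choice of bias assumption can dominate the projected summer warming.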
Are spectroscopic factors from transfer reactions consistent with asymptotic normalisation coefficients?
It is extremely important to devise a reliable method to extract spectroscopic factors from transfer cross sections. We analyse the standard DWBA procedure and combine it with the asymptotic normalisation coefficient extracted from an independent data set. We find that the single-particle parameters used in the past generate inconsistent asymptotic normalisation coefficients. In order to obtain a consistent spectroscopic factor, non-standard parameters for the single-particle overlap functions can be used but, as a consequence, reduced spectroscopic strengths often emerge. Different choices of optical potentials and higher-order effects in the reaction model are also studied. Our test cases consist of C(d,p)C(g.s.) at MeV, O(d,p)O(g.s.) at MeV and Ca(d,p)Ca(g.s.) at MeV. We underline the importance of performing experiments specifically designed to extract ANCs for these systems. Comment: 15 pages, 12 figures, Phys. Rev. C (in press)
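As background for the consistency check discussed above, the textbook relation between the ANC C of the full overlap function, the spectroscopic factor S, and the single-particle ANC b of the bound-state wave function used in DWBA is C² = S b². The sketch below only restates that standard relation; the function name and the example numbers are hypothetical and not taken from the paper.

```python
# Standard relation behind the consistency check: C**2 = S * b**2, where
#   C = ANC of the full overlap function,
#   b = single-particle ANC of the chosen bound-state wave function,
#   S = spectroscopic factor.
def spectroscopic_factor_from_anc(anc_c, single_particle_b):
    """S implied by an independently measured ANC and a given single-particle ANC."""
    return (anc_c / single_particle_b) ** 2

# Hypothetical numbers: a larger single-particle ANC b (e.g. from a wider
# binding potential) implies a smaller spectroscopic factor for the same C.
print(spectroscopic_factor_from_anc(1.9, 2.1))  # ~0.82
```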
Monitoring of Carboxypeptidase Digestion by Matrix-Assisted Laser Desorption and Ionization Mass Spectrometry
The potential of matrix-assisted laser desorption and ionization mass spectrometry (LDI-MS) is demonstrated by monitoring and analyzing the digestion of (human) pTH (1–34), a synthetic peptide, with carboxypeptidases Y and B. All occurring ion signals in the mass spectra could be identified as degraded peptides. By calculating the mass differences between successive degraded peptides, it was possible to identify the released amino acids and to determine 8 amino acids of the C-terminus of the original peptide. For a single MS measurement, only 2 pmol of substrate was needed. Time-course analysis of the cleavage of the first amino acid residue gave insight into the kinetics involved. These measurements strongly support the hope that quantitative information about concentrations can be extracted from LDI-MS.
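The ladder-sequencing step described above (reading off released residues from mass differences between successive truncation products) can be illustrated with a small sketch. The mass tolerance, the function name, the reduced residue-mass table, and the example masses are assumptions for illustration, not details from the paper.

```python
# Toy C-terminal ladder sequencing: each mass difference between successive
# truncated peptides should match the residue mass of the released amino acid.
# Monoisotopic residue masses (Da) for a subset of amino acids.
RESIDUE_MASSES = {
    "G": 57.02146, "A": 71.03711, "S": 87.03203, "V": 99.06841,
    "L/I": 113.08406, "N": 114.04293, "D": 115.02694, "K": 128.09496,
    "E": 129.04259, "H": 137.05891, "F": 147.06841, "R": 156.10111,
}

def assign_residues(peptide_masses, tolerance=0.05):
    """Match successive mass differences (heaviest peptide first) to residues."""
    ladder = sorted(peptide_masses, reverse=True)
    released = []
    for heavier, lighter in zip(ladder, ladder[1:]):
        delta = heavier - lighter
        best = min(RESIDUE_MASSES, key=lambda aa: abs(RESIDUE_MASSES[aa] - delta))
        if abs(RESIDUE_MASSES[best] - delta) <= tolerance:
            released.append(best)
        else:
            released.append(f"?({delta:.3f})")
    return released

# Hypothetical masses of an intact peptide and two truncation products.
print(assign_residues([1500.80, 1353.73, 1239.69]))  # ['F', 'N'] (illustrative)
```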
Serological evidence of human granulocytic ehrlichiosis in Switzerland
To investigate whether human granulocytic ehrlichiosis (HGE) is prevalent in Switzerland, 1515 human serum samples from individuals with different risks for tick exposure were tested for antibodies to Ehrlichia phagocytophila, a surrogate marker of the agent of HGE. The distribution of titres showed marked differences between sera of individuals with no or low risk for tick exposure and those with a high risk. The results of serological testing provided evidence of HGE in Switzerland as well as evidence of two types of coinfections: those with the agent of HGE and Borrelia burgdorferi, and those with the agent of HGE and the central European tick-borne encephalitis virus.
Strongyloides stercoralis larvae excretion patterns before and after treatment
The variability of larval excretion impedes the parasitological diagnosis of Strongyloides stercoralis in infected individuals. We assessed the number of larvae excreted per gram (LPG) stool in 219 samples from 38 infected individuals over 7 consecutive days before and in 470 samples from 44 persons for 21 consecutive days after ivermectin treatment (200 μg kg⁻¹ BW). The diagnostic sensitivity of a single stool sample was about 75% for individuals with low-intensity infections (≤1 LPG) and increased to 95% for those with high-intensity infections (≥10 LPG). Doubling the number of samples examined per person increased sensitivity to more than 95%, even for low-intensity infections. There was no indication of a cyclic excretion of larvae. After treatment, all individuals stopped excreting larvae within 3 days. Larvae were not detected during any of the following 18 days (total 388 Baermann and 388 Koga agar tests). Two stool samples, collected on consecutive days, are recommended in settings where low or heterogeneous infection intensities are likely. In this way, taking into account the possible biological variability in excretion, the efficacy of ivermectin treatment can be assessed as soon as 4 days after treatment.
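As a rough back-of-the-envelope check on the reported gain from a second sample, a naive calculation can be made; the independence assumption below is mine, not the paper's, and ignores day-to-day correlation in larval excretion.

```python
# Naive illustration: if each stool sample independently detects a
# low-intensity infection with probability ~0.75, two samples give
# 1 - (1 - 0.75)**2 = 0.9375, in the neighbourhood of the reported >95%
# (the true gain depends on how excretion varies between days).
p_single = 0.75
p_two_samples = 1 - (1 - p_single) ** 2
print(p_two_samples)  # 0.9375
```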
Kilometer-scale climate models: Prospects and challenges
Currently, major efforts are underway toward refining the horizontal resolution (or grid spacing) of climate models to about 1 km, using both global and regional climate models (GCMs and RCMs). Several groups have succeeded in conducting kilometer-scale multi-week GCM simulations and decade-long continental-scale RCM simulations. There is the well-founded hope that this increase in resolution represents a quantum jump in climate modeling, as it enables replacing the parameterization of moist convection with an explicit treatment. It is expected that this will improve the simulation of the water cycle and of extreme events and reduce uncertainties in climate change projections. While kilometer-scale resolution is commonly employed in limited-area numerical weather prediction, enabling it on global scales for extended climate simulations requires a concerted effort. In this paper, we exploit an RCM that runs entirely on graphics processing units (GPUs) and show examples that highlight the prospects of this approach. A particular challenge addressed in this paper relates to the growth in output volumes. It is argued that the data avalanche of high-resolution simulations will make it impractical or impossible to store the data; rather, repeating the simulation and conducting online analysis will become more efficient. A prototype of this methodology is presented. It makes use of a bit-reproducible model version that ensures reproducible simulations across hardware architectures, in conjunction with a data virtualization layer as a common interface for output analyses. An assessment of the potential of these novel approaches will be provided.
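To illustrate the online-analysis idea in the simplest possible terms, here is a hypothetical sketch: instead of writing every output field to disk, the model folds each time step into reduced diagnostics as the simulation runs. The class, field shapes, and choice of statistics are illustrative assumptions, not part of the described prototype.

```python
import numpy as np

# Hypothetical online-analysis accumulator: keeps only reduced statistics
# (running mean and running maximum) of a model field instead of storing
# every time step, which is the essence of analysing output "online".
class OnlineFieldStats:
    def __init__(self, shape):
        self.n = 0
        self.mean = np.zeros(shape)
        self.maximum = np.full(shape, -np.inf)

    def update(self, field):
        """Fold one time step's field into the running statistics."""
        self.n += 1
        self.mean += (field - self.mean) / self.n   # incremental mean update
        self.maximum = np.maximum(self.maximum, field)

# Toy usage: 100 "time steps" of a small 2D precipitation-like field.
rng = np.random.default_rng(0)
stats = OnlineFieldStats((4, 4))
for _ in range(100):
    stats.update(rng.gamma(shape=2.0, scale=1.0, size=(4, 4)))
print(stats.mean.round(2))
print(stats.maximum.round(2))
```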