On the use of robust regression in econometrics
The use of robust regression estimators has gained popularity among applied econometricians. The main argument invoked to justify the use of the robust estimators is that they provide efficiency gains in the presence of outliers or non-normal errors. Unfortunately, most practitioners seem to be unaware of the fact that heteroskedastic and skewed errors can dramatically affect the properties of these estimators. In this paper we reconsider the interpretation of the specific robust estimator that has become popular in applied econometrics, and conclude that its use in this context cannot be generally recommended.
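The contrast the abstract draws can be illustrated with a toy comparison of OLS against a Huber M-estimator fitted by iteratively reweighted least squares (IRLS). This is a generic sketch: the estimator, the simulated skewed/heteroskedastic errors, and the tuning constant are illustrative assumptions, not the specific robust estimator analysed in the paper.

```python
import random

def ols_slope(xs, ys):
    # Ordinary least squares slope for y = a + b*x (with intercept).
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    return sxy / sxx

def huber_slope(xs, ys, k=1.345, iters=50):
    # Huber M-estimator of the slope via iteratively reweighted least
    # squares; k = 1.345 is the conventional tuning constant.
    n = len(xs)
    a, b = 0.0, ols_slope(xs, ys)  # start from the OLS fit
    for _ in range(iters):
        resid = [y - (a + b * x) for x, y in zip(xs, ys)]
        # Robust scale estimate: median absolute deviation / 0.6745
        s = sorted(abs(r) for r in resid)[n // 2] / 0.6745 or 1.0
        w = [1.0 if abs(r) <= k * s else k * s / abs(r) for r in resid]
        sw = sum(w)
        mx = sum(wi * x for wi, x in zip(w, xs)) / sw
        my = sum(wi * y for wi, y in zip(w, ys)) / sw
        sxx = sum(wi * (x - mx) ** 2 for wi, x in zip(w, xs))
        sxy = sum(wi * (x - mx) * (y - my) for wi, x, y in zip(w, xs, ys))
        b = sxy / sxx
        a = my - b * mx
    return b

random.seed(0)
xs = [i / 10 for i in range(100)]
# Skewed, heteroskedastic errors: centred exponential noise whose
# scale grows with x -- the situation the abstract warns about.
ys = [2.0 * x + (random.expovariate(1.0) - 1.0) * (0.5 + x) for x in xs]
print(ols_slope(xs, ys), huber_slope(xs, ys))
```

Under such errors both estimates remain in the neighbourhood of the true slope, but their sampling behaviour (and hence the claimed efficiency gain) differs from the textbook symmetric-outlier case.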
Kinetically Trapped Liquid-State Conformers of a Sodiated Model Peptide Observed in the Gas Phase
We investigate the peptide AcPheAla5LysH+, a model system for studying helix formation in the gas phase, in order to fully understand the forces that stabilize the helical structure. In particular, we address the question of whether the local fixation of the positive charge at the peptide's C-terminus is a prerequisite for forming helices by replacing the protonated C-terminal Lys residue with Ala and a sodium cation. The combination of gas-phase vibrational spectroscopy of cryogenically cooled ions with molecular simulations based on density-functional theory (DFT) allows for detailed structure elucidation. For sodiated AcPheAla6, we find globular rather than helical structures, as the mobile positive charge interacts strongly with the peptide backbone and disrupts secondary structure formation. Interestingly, the global minimum structure from simulation is not present in the experiment. We attribute this to high barriers involved in rearranging the peptide-cation interaction, which ultimately result in kinetically trapped structures being observed in the experiment.
Comment: 28 pages, 10 figures
System-specific parameter optimization for non-polarizable and polarizable force fields
The accuracy of classical force fields (FFs) has been shown to be limited for the simulation of cation-protein systems, despite their importance in understanding the processes of life. Improvements can result from optimizing the parameters of classical FFs or from extending the FF formulation with terms describing charge transfer and polarization effects. In this work, we introduce our implementation of the CTPOL model in OpenMM, which extends the classical additive FF formula by adding charge transfer (CT) and polarization (POL). Furthermore, we present an open-source parameterization tool, called FFAFFURR, that enables the (system-specific) parameterization of OPLS-AA and CTPOL models. The performance of our workflow was evaluated by its ability to reproduce quantum chemistry energies and by molecular dynamics simulations of a zinc-finger protein.
Comment: 62 pages and 25 figures (including SI), manuscript to be submitted soon
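As a minimal sketch of what system-specific parameter optimization means in practice, the example below fits a single Lennard-Jones well depth to synthetic reference energies standing in for a quantum-chemistry scan. The functional form, grid, and data are hypothetical and far simpler than the OPLS-AA/CTPOL parameterization that FFAFFURR performs.

```python
def lj_energy(r, epsilon, sigma):
    # 12-6 Lennard-Jones pair energy.
    sr6 = (sigma / r) ** 6
    return 4.0 * epsilon * (sr6 * sr6 - sr6)

def fit_epsilon(distances, ref_energies, sigma, eps_grid):
    # System-specific fit: choose the well depth epsilon that minimizes
    # the squared error against reference (e.g. QM) energies.
    def sse(eps):
        return sum((lj_energy(r, eps, sigma) - e) ** 2
                   for r, e in zip(distances, ref_energies))
    return min(eps_grid, key=sse)

# Synthetic "reference" energies generated with a known epsilon,
# standing in for a quantum-chemistry distance scan.
sigma = 3.0
true_eps = 0.25
distances = [2.8 + 0.1 * i for i in range(15)]
ref = [lj_energy(r, true_eps, sigma) for r in distances]

eps_grid = [0.05 * k for k in range(1, 21)]  # 0.05 .. 1.00
best = fit_epsilon(distances, ref, sigma, eps_grid)
print(best)  # the grid point closest to the generating epsilon
```

In a real workflow the objective would span many parameters and conformations, and the reference energies would come from quantum-chemistry calculations rather than from the model itself.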
Assessment and weighting of meteorological ensemble forecast members based on supervised machine learning with application to runoff simulations and flood warning
Numerical weather forecasts, such as meteorological forecasts of precipitation, are inherently uncertain. These uncertainties depend on the model physics as well as on the initial and boundary conditions. Since precipitation forecasts form the input to hydrological models, the uncertainties of the precipitation forecasts propagate into uncertainties of flood forecasts. To account for these uncertainties, ensemble prediction systems are applied. These systems consist of several members simulated by different models or by a single model under varying initial and boundary conditions. However, an excessively wide uncertainty range, obtained by including members with poor prediction skill, may lead to underestimation or exaggeration of the risk of hazardous events. Therefore, the uncertainty range of model-based flood forecasts derived from the meteorological ensembles has to be restricted.
In this paper, a methodology for improving flood forecasts by weighting ensemble members according to their skill is presented. The skill of each ensemble member is evaluated by comparing its past forecasts with the corresponding observed values. Since numerous forecasts are required to evaluate the skill reliably, the evaluation procedure is time-consuming and tedious. Moreover, the evaluation is highly subjective, because the expert who performs it bases the decision on implicit knowledge.
Therefore, approaches for the automated evaluation of such forecasts are required. Here, we present a semi-automated approach for the assessment of precipitation forecast ensemble members. The approach is based on supervised machine learning and was tested on ensemble precipitation forecasts for the area of the Mulde river basin in Germany. Based on the evaluation results for the individual ensemble members, weights corresponding to their forecast skill were calculated. These weights were then successfully used to reduce the uncertainties within rainfall-runoff simulations and flood risk predictions.
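A minimal sketch of skill-based member weighting, assuming a simple inverse-mean-squared-error score in place of the supervised classifier described in the paper; the member forecasts and observations are hypothetical.

```python
def member_weights(past_forecasts, observations, eps=1e-12):
    # Skill-based weights: each member's weight is the inverse of its
    # mean squared error against past observations, normalized to sum to 1.
    inv_mse = []
    for member in past_forecasts:
        mse = sum((f - o) ** 2 for f, o in zip(member, observations)) / len(observations)
        inv_mse.append(1.0 / (mse + eps))
    total = sum(inv_mse)
    return [w / total for w in inv_mse]

def weighted_ensemble_mean(forecasts, weights):
    # Combine the current member forecasts using the skill weights.
    return sum(w * f for w, f in zip(weights, forecasts))

# Hypothetical past precipitation forecasts (mm) from three members.
obs = [5.0, 0.0, 12.0, 3.0]
past = [
    [5.2, 0.1, 11.5, 3.3],   # skilful member
    [4.0, 1.0, 14.0, 2.0],   # moderate member
    [9.0, 4.0, 20.0, 8.0],   # poor member
]
w = member_weights(past, obs)
print(w)
print(weighted_ensemble_mean([6.0, 5.0, 10.0], w))
```

The skilful member dominates the weighted mean, which is the intended effect: poorly performing members contribute little, narrowing the uncertainty range passed to the rainfall-runoff model.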
Forward K+ production in subthreshold pA collisions at 1.0 GeV
K+ meson production in pA (A = C, Cu, Au) collisions has been studied using the ANKE spectrometer at an internal target position of the COSY-Juelich accelerator. The complete momentum spectrum of kaons emitted at forward angles, theta < 12 degrees, has been measured for a beam energy of T(p) = 1.0 GeV, far below the free NN threshold of 1.58 GeV. The spectrum does not follow a thermal distribution at low kaon momenta, and the larger momenta reflect a high degree of collectivity in the target nucleus.
Comment: 4 pages, 3 figures
The concurrent programming of saccades
Sequences of saccades have been shown to be prepared concurrently; however, it remains unclear exactly which aspects of those saccades are programmed in parallel. To examine this, participants were asked to make one or two target-driven saccades: a reflexive saccade; a voluntary saccade; a reflexive then a voluntary saccade; or vice versa. During the first response, the position of the second target was manipulated. The new location of the second saccade target was found to affect second-saccade latencies and accuracy, showing that some aspects of the second saccade program are prepared in parallel with the first. However, the specific pattern of effects differed for each sequence type. These differences fit well within a general framework for saccade control in which a common priority map is computed and the influence of saccade programs on one another depends not so much on the types of saccade being produced as on the rate at which their programs develop.