Pulsar Science with the SKA
The SKA will be transformational for many areas of science, but in particular
for the study of neutron stars and their usage as tools for fundamental physics
in the form of radio pulsars. Since the last science case for the SKA, numerous
and unexpected advances have been made broadening the science goals even
further. With the design of SKA Phase 1 being finalised, it is time to confront
the new knowledge in this field, with the prospects promised by this exciting
new telescope. While technically challenging, we can build our expectations on
recent discoveries and technical developments that have reinforced our previous
science goals.
Comment: 12 pages, 2 figures, to be published in: "Advancing Astrophysics with the Square Kilometre Array", Proceedings of Science, PoS(AASKA14)03
NNLO contributions to jet photoproduction and determination of \alpha_s
We present the first calculation of inclusive jet photoproduction with
next-to-next-to-leading order (NNLO) contributions, obtained from a unified
threshold resummation formalism. The leading coefficients for direct
photoproduction are computed analytically. Together with the coefficients
pertinent to parton-parton scattering, they are shown to agree with those
appearing in our full next-to-leading order calculations. For hadron-hadron
scattering, numerical agreement is found with a previous calculation of jet
production at the Tevatron. We show that the direct and resolved NNLO
contributions considerably improve the description of final ZEUS data on jet
photoproduction and that the error on the determination of the strong coupling
constant is significantly reduced.
Comment: 4 pages, 3 figures
Alternating model trees
Model tree induction is a popular method for tackling regression problems requiring interpretable models. Model trees are decision trees with multiple linear regression models at the leaf nodes. In this paper, we propose a method for growing alternating model trees, a form of option tree for regression problems. The motivation is that alternating decision trees achieve high accuracy in classification problems because they represent an ensemble classifier as a single tree structure. As in alternating decision trees for classification, our alternating model trees for regression contain splitter and prediction nodes, but we use simple linear regression functions as opposed to constant predictors at the prediction nodes. Moreover, additive regression using forward stagewise modeling is applied to grow the tree rather than a boosting algorithm. The size of the tree is determined using cross-validation. Our empirical results show that alternating model trees achieve significantly lower squared error than standard model trees on several regression datasets.
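The core building block described in this abstract, forward stagewise additive regression with simple (single-feature) linear regression base learners, can be sketched as follows. This is a minimal illustration of the general technique, not the authors' tree-growing algorithm; the function names, the shrinkage parameter, and the stage count are assumptions chosen for the example.

```python
import numpy as np

def fit_simple_linreg(X, r):
    """Fit one simple linear regression (intercept + one feature) to the
    residuals r, choosing the feature that gives the lowest squared error."""
    best = None
    for j in range(X.shape[1]):
        x = X[:, j]
        denom = np.sum((x - x.mean()) ** 2)
        if denom == 0.0:
            continue  # constant feature carries no signal
        b = np.sum((x - x.mean()) * (r - r.mean())) / denom
        a = r.mean() - b * x.mean()
        sse = np.sum((r - (a + b * x)) ** 2)
        if best is None or sse < best[0]:
            best = (sse, j, a, b)
    _, j, a, b = best
    return j, a, b

def forward_stagewise(X, y, n_stages=50, shrinkage=0.5):
    """Additive regression: each stage fits a simple linear regression to the
    current residuals and adds a shrunken copy of it to the model."""
    residual = y.astype(float).copy()
    stages = []
    for _ in range(n_stages):
        j, a, b = fit_simple_linreg(X, residual)
        stages.append((j, a, b))
        residual -= shrinkage * (a + b * X[:, j])
    return stages, shrinkage

def predict(stages, shrinkage, X):
    """Sum the shrunken stage predictions, mirroring how an alternating
    tree sums contributions along its prediction nodes."""
    out = np.zeros(X.shape[0])
    for j, a, b in stages:
        out += shrinkage * (a + b * X[:, j])
    return out
```

In the paper's trees the stages are organised as splitter and prediction nodes inside a single tree structure rather than a flat list, but the fitting principle, repeatedly regressing simple linear functions onto the residuals, is the one shown here.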