Outlier Detection and Missing Value Estimation in Time Series Traffic Count Data: Final Report of SERC Project GR/G23180.
A serious problem in analysing traffic count data is what to do when missing or extreme values occur, perhaps as a result of a breakdown in automatic counting equipment. The objectives of this work were to address the problem by:
1) establishing the applicability of time series and influence function techniques for estimating missing values and detecting outliers in time series traffic data;
2) making a comparative assessment of the new techniques against those used in practice by traffic engineers for local, regional or national traffic count systems.
Two alternative approaches were identified as potentially useful, and these were evaluated and compared with methods currently employed for 'cleaning' traffic count series. One was based on evaluating the effect of individual observations, or groups of observations, on the estimate of the autocorrelation structure; the other on events influencing a parametric (ARIMA) model.
These were compared with the existing methods, which included visual inspection and smoothing techniques such as the exponentially weighted moving average, in which means and variances are updated using observations from the same time of day and day of week.
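The day-of-week EWMA scheme just described can be sketched as follows. This is an illustrative reconstruction, not the project's actual implementation: the smoothing constant `alpha`, the 3-standard-deviation flagging rule, and the function names are all assumptions.

```python
# Hedged sketch of day-of-week EWMA validation: each (time-of-day, day-of-week)
# slot keeps a running mean and variance, updated exponentially as new counts
# arrive. alpha and k are illustrative choices.

def ewma_update(mean, var, x, alpha=0.2):
    """Fold one new count x into the running mean/variance."""
    err = x - mean
    mean = mean + alpha * err
    var = (1 - alpha) * (var + alpha * err * err)
    return mean, var

def validate_count(x, mean, var, k=3.0):
    """Flag x as an outlier if it lies more than k standard deviations
    from the slot mean; return (is_outlier, suggested_replacement)."""
    sd = var ** 0.5
    if abs(x - mean) > k * sd:
        return True, mean   # replace with the current smoothed mean
    return False, x
```

A slot holding mean 100 and variance 25 would, for example, accept a count of 101 unchanged but flag a count of 200 and propose the smoothed mean as its replacement.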
The results showed advantages and disadvantages for each of the methods.
The exponentially weighted moving average method tended to detect unreasonable outliers and also suggested replacements which were consistently larger than could reasonably be expected.
Methods based on the autocorrelation structure were reasonably successful in detecting events but the replacement values were suspect particularly when there were groups of values needing replacement. The methods also had problems in the presence of non-stationarity, often detecting outliers which were really a result of the changing level of the data rather than extreme values. In the presence of other events, such as a change in level or seasonality, both the influence function and change in autocorrelation present problems of interpretation since there is no way of distinguishing these events from outliers.
It is clear that the outlier problem cannot be separated from that of identifying structural changes, as many of the statistics used to identify outliers also respond to structural changes. An ARIMA(1,0,0)(0,1,1)7 model was found to describe the vast majority of traffic count series, which means that the problem of identifying a starting model can, with a high degree of assurance, largely be avoided.
Unfortunately, it is clear that a black-box approach to data validation is prone to error, but methods such as those described above lend themselves to an interactive graphics data-validation technique in which outliers and other events are highlighted for manual acceptance or rejection. An adaptive approach to fitting the model may result in something more automatic, which would allow changes in the underlying model to be accommodated.
In conclusion, it was found that methods based on the autocorrelation structure are the most computationally efficient but lead to problems of interpretation, both between different types of event and in the presence of non-stationarity. Using the residuals from a fitted ARIMA model is the most successful method for finding outliers and distinguishing them from other events, and is less expensive than case deletion. The replacement values derived from the ARIMA model were found to be the most accurate.
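The residual-based approach can be illustrated with a deliberately simplified sketch of the ARIMA(1,0,0)(0,1,1)7 form reported above. To keep the example self-contained, the seasonal MA(1) term is dropped: the series is seasonally differenced at lag 7 and an AR(1) coefficient is fitted by least squares. That simplification, the 3-sigma flagging rule, and all names are assumptions for illustration, not the project's method.

```python
# Illustrative residual screening for weekly traffic counts, loosely
# following ARIMA(1,0,0)(0,1,1)_7 but omitting the seasonal MA term.

def seasonal_diff(y, s=7):
    """Difference the series at the seasonal lag s."""
    return [y[t] - y[t - s] for t in range(s, len(y))]

def fit_ar1(w):
    """Least-squares AR(1) coefficient for a roughly mean-zero series."""
    num = sum(w[t] * w[t - 1] for t in range(1, len(w)))
    den = sum(w[t - 1] ** 2 for t in range(1, len(w)))
    return num / den if den else 0.0

def flag_outliers(y, s=7, k=3.0):
    """Return indices (in the original series) whose one-step residual
    exceeds k residual standard deviations."""
    w = seasonal_diff(y, s)
    phi = fit_ar1(w)
    resid = [w[t] - phi * w[t - 1] for t in range(1, len(w))]
    m = sum(resid) / len(resid)
    sd = (sum((r - m) ** 2 for r in resid) / len(resid)) ** 0.5
    return [j + s + 1 for j, r in enumerate(resid) if abs(r - m) > k * sd]
```

Note that a single spike also perturbs the seasonally differenced series one week later, so a spike tends to be flagged twice; this echoes the interpretation problems the report describes when events and differencing interact.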
SETAR Modelling of Traffic Count Data.
As part of a SERC-funded project investigating outlier detection and replacement in transport data, univariate Box-Jenkins (1976) models have already been successfully applied to traffic count series (see Redfern et al, 1992). However, the underlying assumption of normality in ARIMA models means they are not ideally suited to time series exhibiting certain behavioural characteristics. The limitations of ARIMA models are discussed in some detail by Tong (1983), including problems with time irreversibility, non-normality, cyclicity and asymmetry. Data with irregularly spaced extreme values are unlikely to be modelled well by ARIMA models, which are better suited to data where the probability of a very high value is small. Tong (1983) argues that one way of modelling such non-normal behaviour might be to retain the general ARIMA framework and allow the white noise element to be non-Gaussian. As an alternative he proposes abandoning the linearity assumption and defines a group of non-linear structures, one of which is the Self-Exciting Threshold Autoregressive (SETAR) model. The model form is described in more detail below but essentially consists of two (or more) piecewise linear models, with the time series "tripping" between them according to its value with respect to a threshold point. The model is called "Self-Exciting" because the indicator variable determining the appropriate linear model for each piece of data is itself a function of the data series. Intuitively, this means the mechanism driving the alternation between model forms is not an external input such as a related time series (other models can be defined where such an input exists), but is contained within the series itself. The series is thus Self-Exciting.
The three concepts embedded within the SETAR model structure are those of the threshold, limit cycle and time delay, each of which can be illustrated by the diverse applications such models can take.
The threshold can be defined as some point beyond which, if the data falls, the series structure changes inherently and so an alternative linear model form would be appropriate. In hydrology this is seen as the non-linearity of soil infiltration, where at the soil saturation point (threshold) a new model for infiltration would become appropriate.
Limit cycles describe the stable cyclical phenomena which we sometimes observe within time series. The cyclical behaviour is stationary, i.e. consists of regular, sustained oscillations, and is an intrinsic property of the data. The limit cycle phenomenon is physically observable in the field of radio engineering, where a triode valve is used to generate oscillations (see Tong, 1983 for a full description). Essentially the triode valve produces self-sustaining oscillations between emitting and collecting electrons, according to the voltage of a grid placed between the anode and cathode (which thereby acts as the threshold indicator).
The third essential concept within the SETAR structure is that of the time delay and is perhaps intuitively the easiest to grasp. It can be seen within the field of population biology where many types of non-linear model may apply. For example within the cyclical oscillations of blowfly population data there is an inbuilt "feedback" mechanism given by the hatching period for eggs, which would give rise to a time delay parameter within the model. For some processes this inherent delay may be so small as to be virtually instantaneous and so the delay parameter could be omitted.
In general time series work, Tong (1983) found the SETAR model well suited to the cyclical Canadian lynx trapping series and to modelling riverflow systems (Tong, Thanoon & Gudmundsson, 1984). Here we investigate its applicability to time series traffic counts, some of which have exhibited the kind of non-linear and cyclical characteristics that could undermine a straightforward linear modelling process.
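The two-regime structure described above can be made concrete with a minimal simulation of a SETAR(2; 1, 1) model: two AR(1) regimes, with the active regime chosen by comparing the series itself, lagged by a delay d, against a threshold r. The coefficients, threshold, and delay below are invented for illustration only.

```python
# Minimal two-regime SETAR simulation sketch. The regime indicator is the
# series' own value d steps back -- this is what makes it "self-exciting".
import random

def simulate_setar(n, r=0.0, d=1, seed=1):
    """Simulate n points of an illustrative SETAR(2; 1, 1) process."""
    random.seed(seed)
    x = [0.0, 0.0]                    # start-up values
    for t in range(2, n):
        e = random.gauss(0.0, 1.0)    # Gaussian innovation
        if x[t - d] <= r:             # lower regime
            x.append(0.6 * x[t - 1] + 1.0 + e)
        else:                         # upper regime
            x.append(-0.4 * x[t - 1] - 1.0 + e)
    return x
```

The drift terms push each regime back across the threshold, so the simulated series oscillates between the two linear laws, the kind of sustained cyclical behaviour (limit cycle) the abstract describes.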
Efficacy of pimobendan in the prevention of congestive heart failure or sudden death in Doberman Pinschers with preclinical dilated cardiomyopathy (the PROTECT study)
Background: The benefit of pimobendan in delaying the progression of preclinical dilated cardiomyopathy (DCM) in Dobermans is not reported.
Hypothesis: That chronic oral administration of pimobendan to Dobermans with preclinical DCM will delay the onset of congestive heart failure (CHF) or sudden death and improve survival.
Animals: Seventy-six client-owned Dobermans recruited at 10 centers in the UK and North America.
Methods: The trial was a randomized, blinded, placebo-controlled, parallel-group multicenter study. Dogs were allocated in a 1:1 ratio to receive pimobendan (Vetmedin capsules) or visually identical placebo. The composite primary endpoint was prospectively defined as either onset of CHF or sudden death. Time to death from all causes was a secondary endpoint.
Results: The proportion of dogs reaching the primary endpoint was not significantly different between groups (P = .1). The median time to the primary endpoint (onset of CHF or sudden death) was significantly longer in the pimobendan group (718 days, IQR 441–1152 days) than in the placebo group (441 days, IQR 151–641 days) (log-rank P = .0088). The median survival time was significantly longer in the pimobendan group (623 days, IQR 491–1531 days) than in the placebo group (466 days, IQR 236–710 days) (log-rank P = .034).
Conclusion and Clinical Importance: The administration of pimobendan to Dobermans with preclinical DCM prolongs the time to the onset of clinical signs and extends survival. Treatment of dogs in the preclinical phase of this common cardiovascular disorder with pimobendan can lead to an improved outcome.
QCD Down Under: Building Bridges
The strong coupling regime of QCD is responsible for 99% of hadronic phenomena. Though considerable progress has been made in solving QCD in this non-perturbative region, we nevertheless have to rely on a disparate range of models and approximations. If we are to gain an understanding of the underlying physics, and not just have numerical answers from computing "black boxes", we must build bridges between the parameter space where models and approximations are valid and the regime describing experiment, and between the different modellings of strong dynamics. We describe here how the Schwinger-Dyson/Bethe-Salpeter approach provides just such a bridge, linking physics, the lattice and experiment.
Comment: 8 pages, 10 figures. Opening talk at Workshop on QCD Down Under, March 2004, Barossa Valley and Adelaide (to be published in the Proceedings).
Religious diversity, empathy, and God images: perspectives from the psychology of religion shaping a study among adolescents in the UK
Major religious traditions agree in advocating and promoting love of neighbour as well as love of God. Love of neighbour is reflected in altruistic behaviour, and empathy stands as a key motivational factor underpinning altruism. This study employs the empathy scale from the Junior Eysenck Impulsiveness Questionnaire to assess the association between empathy and God images among a sample of 5993 religiously diverse adolescents (13–15 years old) attending state-maintained schools in England, Northern Ireland, Scotland, Wales, and London. The key psychological theory being tested by these data concerns the linkage between God images and individual differences in empathy. The data demonstrate that religious identity (e.g. Christian, Muslim) and religious attendance are less important than the God images which young people hold. The image of God as a God of mercy is associated with higher empathy scores, while the image of God as a God of justice is associated with lower empathy scores.
Spontaneous Creation of Inflationary Universes and the Cosmic Landscape
We study some gravitational instanton solutions that offer a natural
realization of the spontaneous creation of inflationary universes in the brane
world context in string theory. Decoherence due to couplings of higher
(perturbative) modes of the metric as well as matter fields modifies the
Hartle-Hawking wavefunction for de Sitter space. Generalizing this new
wavefunction to be used in string theory, we propose a principle in string
theory that hopefully will lead us to the particular vacuum we live in, thus
avoiding the anthropic principle. As an illustration of this idea, we give a
phenomenological analysis of the probability of quantum tunneling to various
stringy vacua. We find that the preferred tunneling is to an inflationary
universe (like our early universe), not to a universe with a very small
cosmological constant (i.e., like today's universe) and not to a 10-dimensional
uncompactified de Sitter universe. Such preferred solutions are interesting as
they offer a cosmological mechanism for the stabilization of extra dimensions
during the inflationary epoch.
Comment: 52 pages, 7 figures, 1 table. Added discussion on supercritical string vacua, added reference.
Using Knowledge of Student Cognition to Differentiate Instruction
By all accounts, learning is a complex task that requires a student to use and apply a range of cognitive skills. A student's ability to retain information while performing concurrent processing, often referred to as working memory (WM), is critical to the acquisition of increasingly more complex knowledge and skills. Not surprisingly, WM is often linked to successful learning and student academic achievement. According to the academic literature, WM is a very useful measure of a student's capability to acquire new information. Most students are able to successfully respond to classroom instruction that requires them to rely on their WM to acquire new knowledge or skills. Unfortunately, some students struggle and ultimately fail to process information effectively, which, in turn, negatively affects the outcome of instruction. In this article, we examine the relationship between student learning and the cognitive processes required to acquire new knowledge, with a specific focus on WM and attention. We first offer a brief definition of WM and discuss ways that students apply WM to their daily lives. Then, based on the premise that learning requires both memory and attention, we discuss the role of WM and attention in classroom learning and in the acquisition of new knowledge. We highlight the interrelationship between WM and learning difficulties and disabilities among some students. After providing an understanding of the role of WM and attention in learning, we offer research-based strategies for differentiating instruction and addressing the diverse needs of students in an inclusive classroom.
Measuring readiness-to-hand through differences in attention to the task vs. attention to the tool
New interaction techniques, like multi-touch, tangible interaction, and mid-air gestures often promise to be more intuitive and natural; however, there is little work on how to measure these constructs. One way is to leverage the phenomenon of tool embodiment: when a tool becomes an extension of one's body, attention shifts to the task at hand, rather than the tool itself. In this work, we constructed a framework to measure tool embodiment by incorporating philosophical and psychological concepts. We applied this framework to design and conduct a study that uses attention to measure readiness-to-hand with both a physical tool and a virtual tool. We introduce a novel task where participants use a tool to rotate an object, while simultaneously responding to visual stimuli both near their hand and near the task. Our results showed that participants paid more attention to the task than to both kinds of tool. We also discuss how this evaluation framework can be used to investigate whether novel interaction techniques allow for this kind of tool embodiment.
Radio emission and jets from microquasars
To some extent, all Galactic binary systems hosting a compact object are
potential `microquasars', so much as all galactic nuclei may have been quasars,
once upon a time. The necessary ingredients for a compact object of stellar
mass to qualify as a microquasar seem to be: accretion, rotation and magnetic
field. The presence of a black hole may help, but is not strictly required,
since neutron star X-ray binaries and dwarf novae can be powerful jet sources
as well. The above issues are broadly discussed throughout this Chapter, with
a rather trivial question in mind: why do we care? In other words: are jets a
negligible phenomenon in terms of accretion power, or do they contribute
significantly to dissipating gravitational potential energy? How do they
influence their surroundings? The latter point is especially relevant in a
broader context, as there is mounting evidence that outflows powered by
super-massive black holes in external galaxies may play a crucial role in
regulating the evolution of cosmic structures. Microquasars can also be thought
of as a form of quasars for the impatient: what makes them appealing, despite
their low number statistics with respect to quasars, are the fast variability
time-scales. In the first approximation, the physics of the jet-accretion
coupling in the innermost regions should be set by the mass/size of the
accretor: stellar mass objects vary on 10^5-10^8 times shorter time-scales,
making it possible to study variable accretion modes and related ejection
phenomena over average Ph.D. time-scales. [Abridged]
Comment: 28 pages, 13 figures. To appear in Belloni, T. (ed.): The Jet Paradigm - From Microquasars to Quasars, Lect. Notes Phys. 794 (2009).
The Infrared Behaviour of the Pure Yang-Mills Green Functions
We review the infrared properties of the pure Yang-Mills correlators and
discuss recent results concerning the two classes of low-momentum solutions for
them reported in the literature, i.e. the decoupling and scaling solutions. We
will mainly focus on the Landau gauge and pay special attention to the results
inferred from the analysis of the Dyson-Schwinger equations of the theory and
from "quenched" lattice QCD. The results obtained from properly interplaying
both approaches are strongly emphasized.
Comment: Final version to be published in FBS (54 pgs., 11 figs., 4 tabs.).
