Testing I(1) against I(d) alternatives with Wald Tests in the presence of deterministic components
This paper analyses how to test I(1) against I(d), d<1, in the presence of deterministic components in the DGP, by extending a Wald-type test, namely the (Efficient) Fractional Dickey-Fuller (EFDF) test, to this case. Tests of these hypotheses are important in many economic applications where it is crucial to distinguish between permanent and transitory shocks, because I(d) processes with d<1 are mean-reverting. Moreover, the inclusion of deterministic components is necessary for the analysis of most macroeconomic variables. We show how simple the implementation of the EFDF test is in these situations and argue that, in general, it has better properties than LM tests. Finally, an empirical application is provided in which the EFDF approach allowing for deterministic components is used to test for long memory in the GDP per capita of several OECD countries, an issue that has important consequences for discriminating between growth theories and on which there has been some controversy.
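The testing idea can be illustrated with a simplified sketch. This is not the paper's exact EFDF statistic: deterministic components are handled here by crude demeaning only, and `frac_diff` and `fdf_tstat` are hypothetical helper names. The idea is to regress the first difference of the series on its fractionally differenced lag and examine the t-ratio on that regressor, which is zero under the I(1) null.

```python
import numpy as np

def frac_diff(y, d):
    """Apply the fractional difference operator (1 - L)^d via its
    binomial expansion, truncated at the sample size."""
    n = len(y)
    w = np.empty(n)
    w[0] = 1.0
    for k in range(1, n):
        w[k] = w[k - 1] * (k - 1 - d) / k   # binomial recursion
    out = np.empty(n)
    for t in range(n):
        out[t] = np.dot(w[:t + 1], y[t::-1])
    return out

def fdf_tstat(y, d_alt):
    """t-statistic on phi in:  Delta y_t = phi * Delta^{d_alt} y_{t-1} + e_t.
    Under H0 (unit root, d = 1) phi = 0; large negative values favour
    the mean-reverting I(d_alt) alternative."""
    y = y - y.mean()               # crude stand-in for deterministic terms
    dy = np.diff(y)                # Delta y_t
    x = frac_diff(y, d_alt)[:-1]   # fractionally differenced lag
    phi = np.dot(x, dy) / np.dot(x, x)
    resid = dy - phi * x
    se = np.sqrt(resid @ resid / (len(dy) - 1) / (x @ x))
    return phi / se

rng = np.random.default_rng(0)
rw = np.cumsum(rng.standard_normal(500))   # I(1) random walk (null is true)
print(fdf_tstat(rw, d_alt=0.7))
```

For `d = 1` the binomial weights collapse to (1, -1, 0, ...), so `frac_diff(y, 1)` reproduces the ordinary first difference, which is a convenient sanity check.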
A critical literature review of the effectiveness of various instruments in the diagnosis of dementia in adults with intellectual disabilities
Introduction: Currently, there is no consensus on dementia diagnostics in adults with intellectual disabilities (ID). Three types of assessment are available: direct cognitive tests, test batteries, and informant reports. Methods: A systematic literature search was conducted in four databases, yielding 9840 records. Relevant studies were identified and selected using predefined inclusion and exclusion criteria, then coded and classified according to assessment type. This was completed by two independent researchers, with a third consulted when discrepancies arose. The review collates diagnostic instruments and presents their strengths and weaknesses. Results: Overall, 47 studies met the search criteria, and 43 instruments were extracted from the selected studies. Of these, 10 instruments were classified as test batteries, 23 as direct cognitive tests, and the remaining 10 as informant reports. Discussion: This review recommends cognitive test batteries as the most practical and efficient method for dementia diagnosis in individuals with ID.
Measure of combined effects of morphological parameters of inclusions within composite materials via stochastic homogenization to determine effective mechanical properties
In our previous papers we have described efficient and reliable methods for
generating representative volume elements (RVEs) perfectly suited to the
analysis of composite materials via stochastic homogenization.
In this paper we build on these methods to analyse the influence of
morphology on the effective mechanical properties of the samples. More
precisely, we study the dependence of the main mechanical characteristics of a
composite medium on various parameters of the mixture of inclusions composed of
spheres and cylinders. In addition, we introduce various imperfections into the
inclusions and observe the resulting evolution of the effective properties.
The main computational approach used throughout the work is the FFT-based
homogenization technique, validated by comparison with the direct finite
element method. We also give details on the features of the method and of the
validation campaign.
Keywords: Composite materials, Cylindrical and spherical reinforcements,
Mechanical properties, Stochastic homogenization.
Comment: 23 pages, updated figures, version accepted to Composite Structures
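As a rough illustration of the FFT-based homogenization approach the abstract refers to, here is a minimal sketch of the Moulinec-Suquet basic scheme for the scalar (thermal conductivity) analogue of the mechanical problem. The function name, reference-medium choice, and 2D pixel setting are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def homogenize_conductivity(k, E=(1.0, 0.0), k0=None, tol=1e-8, maxit=500):
    """Moulinec-Suquet basic scheme for scalar conductivity on a periodic
    2D pixel grid. k: (N, N) conductivity map; E: imposed mean gradient.
    Returns the effective (volume-averaged) flux for this loading."""
    N = k.shape[0]
    if k0 is None:
        k0 = 0.5 * (k.min() + k.max())      # standard reference medium
    xi = 2 * np.pi * np.fft.fftfreq(N)
    XI0, XI1 = np.meshgrid(xi, xi, indexing="ij")
    norm2 = XI0**2 + XI1**2
    norm2[0, 0] = 1.0                        # avoid 0/0 at the mean mode
    e = np.stack([np.full((N, N), E[0]), np.full((N, N), E[1])])
    for _ in range(maxit):
        tau = (k - k0) * e                   # polarization field
        tau_h = np.fft.fft2(tau, axes=(1, 2))
        div = XI0 * tau_h[0] + XI1 * tau_h[1]
        e_h = np.stack([-XI0 * div / (k0 * norm2),
                        -XI1 * div / (k0 * norm2)])
        e_h[:, 0, 0] = (E[0] * N * N, E[1] * N * N)   # enforce mean gradient
        e_new = np.real(np.fft.ifft2(e_h, axes=(1, 2)))
        err = np.linalg.norm(e_new - e) / max(np.linalg.norm(e), 1e-30)
        e = e_new
        if err < tol:
            break
    return (k * e).mean(axis=(1, 2))         # effective flux <q>

# Circular inclusion (contrast 10) in a matrix, a toy 2D "RVE"
N = 32
Y, X = np.mgrid[0:N, 0:N]
k_map = np.where((X - 16)**2 + (Y - 16)**2 < 64, 10.0, 1.0)
print(homogenize_conductivity(k_map))
```

The effective conductivity of the inclusion-matrix mixture must lie between the two phase values, which gives a quick sanity check on the scheme.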
Computation in generalised probabilistic theories
Given the existence of an efficient quantum algorithm for factoring, it is
likely that quantum computation is intrinsically more powerful than classical
computation. At present, the best upper bound known for the power of quantum
computation is that BQP is in AWPP. This work investigates limits on
computational power that are imposed by physical principles. To this end, we
define a circuit-based model of computation in a class of operationally-defined
theories more general than quantum theory, and ask: what is the minimal set of
physical assumptions under which the above inclusion still holds? We show that
given only an assumption of tomographic locality (roughly, that multipartite
states can be characterised by local measurements), efficient computations are
contained in AWPP. This inclusion still holds even without assuming a basic
notion of causality (where the notion is, roughly, that probabilities for
outcomes cannot depend on future measurement choices). Following Aaronson, we
extend the computational model by allowing post-selection on measurement
outcomes. Aaronson showed that the corresponding quantum complexity class is
equal to PP. Given only the assumption of tomographic locality, the inclusion
in PP still holds for post-selected computation in general theories. Thus in a
world with post-selection, quantum theory is optimal for computation in the
space of all general theories. We then consider whether relativised complexity
results can be obtained for general theories. It is not clear how to define a
sensible notion of an oracle in the general framework that reduces to the
standard notion in the quantum case. Nevertheless, it is possible to define
computation relative to a `classical oracle'. Then, we show there exists a
classical oracle relative to which efficient computation in any theory
satisfying the causality assumption and tomographic locality does not include
NP.
Comment: 14+9 pages. Comments welcome.
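The post-selection primitive underlying Aaronson's argument can be illustrated with a toy classical sketch (a deliberate simplification, not a model of generalised probabilistic theories; the helper `postselect` is a hypothetical name): conditioning on a selection event and renormalising can turn an exponentially unlikely guess into a certainty.

```python
from fractions import Fraction
from itertools import product

def postselect(outcomes, select, output):
    """Exact post-selected output distribution of a finite random process.
    outcomes: iterable of (probability, sample); select(sample) -> bool is
    the post-selection event; output(sample) is the observed result."""
    dist, norm = {}, Fraction(0)
    for p, s in outcomes:
        if select(s):
            norm += p
            dist[output(s)] = dist.get(output(s), Fraction(0)) + p
    if norm == 0:
        raise ValueError("post-selection event has probability zero")
    return {k: v / norm for k, v in dist.items()}   # renormalise

# Toy use: guess a 3-bit string uniformly, then post-select on it
# satisfying a formula with a unique satisfying assignment. Each guess
# succeeds with probability only 1/8, but the post-selected machine
# outputs the satisfying assignment with probability 1.
clause = lambda b: b[0] and (not b[1]) and b[2]
samples = [(Fraction(1, 8), bits) for bits in product([0, 1], repeat=3)]
print(postselect(samples, clause, lambda b: b))
```

This is the mechanism that makes post-selected classes so large: the renormalisation step amplifies arbitrarily small acceptance probabilities.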
Symbolic Algorithms for Language Equivalence and Kleene Algebra with Tests
We first propose algorithms for checking language equivalence of finite
automata over a large alphabet. We use symbolic automata, where the transition
function is compactly represented using (multi-terminal) binary decision
diagrams (BDDs). The key idea consists in computing a bisimulation by exploring
reachable pairs symbolically, so as to avoid redundancies. This idea can be
combined with already existing optimisations, and we show in particular a nice
integration with the disjoint-set forest data structure from Hopcroft and
Karp's standard algorithm. Then we consider Kleene algebra with tests (KAT), an
algebraic theory that can be used for verification in various domains ranging
from compiler optimisation to network programming analysis. This theory is
decidable by reduction to language equivalence of automata on guarded strings,
a particular kind of automata that have exponentially large alphabets. We
propose several methods for constructing symbolic automata out of KAT
expressions, based either on Brzozowski's derivatives or on standard automata
constructions. All in all, this results in efficient algorithms for deciding
equivalence of KAT expressions.
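The bisimulation-with-union-find idea borrowed from Hopcroft and Karp can be sketched as follows. This is the plain, non-symbolic core of the algorithm over explicit transition tables (no BDDs); the encoding of a DFA as an `(initial, delta, accepting)` triple is an illustrative assumption.

```python
def equivalent(dfa1, dfa2, alphabet):
    """Hopcroft-Karp style language-equivalence check for two DFAs.
    Each dfa is (initial, delta, accepting) with delta[(state, letter)]
    giving the next state. States of the two automata are tagged 1/2 to
    keep them disjoint. A disjoint-set forest ensures each pair of
    states is processed at most once."""
    parent = {}

    def find(x):                       # union-find with path halving
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    (i1, d1, f1), (i2, d2, f2) = dfa1, dfa2
    todo = [((1, i1), (2, i2))]
    while todo:
        p, q = todo.pop()
        rp, rq = find(p), find(q)
        if rp == rq:
            continue                   # pair already known equivalent
        accept_p = p[1] in (f1 if p[0] == 1 else f2)
        accept_q = q[1] in (f1 if q[0] == 1 else f2)
        if accept_p != accept_q:
            return False               # found a distinguishing word
        parent[rp] = rq                # merge the two classes
        for a in alphabet:
            dp = d1 if p[0] == 1 else d2
            dq = d1 if q[0] == 1 else d2
            todo.append(((p[0], dp[(p[1], a)]), (q[0], dq[(q[1], a)])))
    return True

# Two different DFAs, both recognising (ab)*
A = (0, {(0,'a'):1, (0,'b'):2, (1,'a'):2, (1,'b'):0,
         (2,'a'):2, (2,'b'):2}, {0})
B = (0, {(0,'a'):1, (0,'b'):3, (1,'a'):3, (1,'b'):2, (2,'a'):1,
         (2,'b'):3, (3,'a'):3, (3,'b'):3}, {0, 2})
print(equivalent(A, B, 'ab'))   # True
```

The symbolic version described in the abstract replaces the per-letter loop with a BDD operation that handles all letters of a large alphabet at once, which is where the efficiency gain comes from.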
Bactericidal effect of corona discharges in atmospheric air
The present paper explores the possibilities of using impulsive and steady-state corona discharges for bio-decontamination operations. A high-tension tubular corona electrode was stressed with positive or negative dc voltage with magnitudes up to 26 kV, and a grounded mesh was used as the opposite electrode. Different operational regimes of this corona generator were investigated for the production of ozone in the air flow and the inactivation of microorganisms. The test microorganisms used in this work were Escherichia coli and Staphylococcus aureus, populations of which were seeded onto agar plates. These bacterial plates were located behind the grounded mesh electrode to assess bactericidal efficacy. The results show that corona discharges have a strong bactericidal effect: for example, positive flashing corona discharges were able to reduce populations of the test microorganism by 94% within a 30-60 s interval. Negative steady-state corona discharges also produce a noticeable bactericidal effect, reducing populations of E. coli and S. aureus by more than 97% within a 120 s energisation interval. The bactericidal efficiency of different corona discharge modes and its correlation with the ozone levels produced by these discharges is discussed. The results obtained in this work will help in the design and development of compact plasma systems for environmental applications.
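The percentage reductions reported above map onto the log-reduction scale commonly used in disinfection studies. The conversion below is the standard one; the function name is ours, and the numbers are simply the abstract's figures restated.

```python
import math

def log_reduction(percent_killed):
    """Convert a percentage kill to the log10-reduction figure used
    for disinfection efficacy (90% -> 1 log, 99% -> 2 log, ...)."""
    surviving = 1.0 - percent_killed / 100.0
    return -math.log10(surviving)

print(round(log_reduction(94), 2))   # 94% kill  -> ~1.22 log reduction
print(round(log_reduction(97), 2))   # >97% kill -> >1.52 log reduction
```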
Multiple shifts and fractional integration in the US and UK unemployment rates
This paper analyses the long-run behaviour of the US and UK unemployment rates by testing for possibly fractional orders of integration and multiple shifts using a sample of over 100 annual observations. The results show that the orders of integration are higher than 0 in both series, which implies long memory. If we assume that the underlying disturbances are white noise, the values are higher than 0.5, i.e., nonstationary. However, if the disturbances are autocorrelated, the orders of integration are in the interval (0, 0.5), implying stationarity and mean-reverting behaviour. Moreover, when multiple shifts are taken into account, unemployment is more persistent in the US than in the UK, implying the need for stronger policy action in the former to bring unemployment back to its original level.
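One standard way to estimate a fractional order of integration d from data of this kind is the Geweke-Porter-Hudak (GPH) log-periodogram regression. The paper's own estimation method may differ, so the sketch below is illustrative only; the bandwidth rule m = sqrt(n) is a common convention, not the authors' choice.

```python
import numpy as np

def gph_estimate(y, bandwidth=None):
    """Geweke-Porter-Hudak log-periodogram estimate of the memory
    parameter d: regress log I(lambda_j) on -2*log(2*sin(lambda_j/2))
    over the first m Fourier frequencies; the slope estimates d."""
    n = len(y)
    m = bandwidth or int(n ** 0.5)
    freqs = 2 * np.pi * np.arange(1, m + 1) / n
    # periodogram at the first m Fourier frequencies
    pdg = np.abs(np.fft.fft(y - np.mean(y))[1:m + 1]) ** 2 / (2 * np.pi * n)
    x = -2 * np.log(2 * np.sin(freqs / 2))
    X = np.column_stack([np.ones(m), x])
    beta, *_ = np.linalg.lstsq(X, np.log(pdg), rcond=None)
    return beta[1]                      # slope = estimate of d

rng = np.random.default_rng(1)
wn = rng.standard_normal(2000)          # white noise: true d = 0
rw = np.cumsum(wn)                      # random walk: true d = 1
print(gph_estimate(wn), gph_estimate(rw))
```

On the two benchmark series the estimates should come out near 0 and near 1 respectively, which mirrors the paper's distinction between stationary (d < 0.5) and nonstationary (d > 0.5) long memory.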