Wiener algebra for the quaternions
We define and study the counterpart of the Wiener algebra in the quaternionic
setting, in both the discrete and the continuous case. We prove a Wiener-L\'evy
type theorem and a factorization theorem, and give applications to Toeplitz and
Wiener-Hopf operators.
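For orientation, here is the classical complex discrete Wiener algebra that the quaternionic construction generalizes (a recap of standard material, not taken from the paper; the quaternionic version replaces the scalar coefficients $a_n$ by quaternions):
\[
  \mathcal{W} \;=\; \Big\{\, f(e^{i\theta}) = \sum_{n\in\mathbb{Z}} a_n e^{in\theta} \;:\; \sum_{n\in\mathbb{Z}} |a_n| < \infty \,\Big\},
\]
and the classical Wiener-L\'evy theorem asserts that if $f \in \mathcal{W}$ vanishes nowhere, then $1/f \in \mathcal{W}$.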
Memory functions and Correlations in Additive Binary Markov Chains
A theory of additive Markov chains with long-range memory, proposed earlier
in Phys. Rev. E 68, 06117 (2003), is developed and used to describe the statistical
properties of long-range correlated systems. A convenient characteristic of
such systems, the memory function, and its relation to the correlation properties
of the system are examined. Various methods for finding the memory function
from the correlation function are proposed. The inverse problem (calculation of
the correlation function from a prescribed memory function) is also solved.
This is demonstrated for an analytically solvable model of a system with a
step-wise memory function.
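A minimal numerical sketch of the additive binary scheme described above (not the authors' code; the value $\bar a = 1/2$, the memory depth $N = 10$, and the step-wise values of $F(r)$ are illustrative): the conditional probability of the next symbol is shifted additively by the previous $N$ symbols through the memory function $F(r)$, and the correlation function is then estimated directly from the generated sequence.

import numpy as np

def simulate_additive_chain(F, a_bar, length, rng=None):
    # Binary chain a_i in {0, 1} whose conditional probability is shifted
    # additively by the previous N symbols through the memory function F:
    #   P(a_i = 1 | history) = a_bar + sum_{r=1..N} F(r) * (a_{i-r} - a_bar).
    rng = np.random.default_rng() if rng is None else rng
    N = len(F)
    a = np.zeros(length, dtype=int)
    a[:N] = rng.random(N) < a_bar                    # seed the first N symbols
    for i in range(N, length):
        window = a[i - N:i][::-1]                    # a_{i-1}, a_{i-2}, ..., a_{i-N}
        p = a_bar + np.dot(F, window - a_bar)
        a[i] = rng.random() < min(max(p, 0.0), 1.0)  # clip to a valid probability
    return a

def correlation_function(a, r_max):
    # Sample estimate of K(r) = <a_i a_{i+r}> - <a_i>^2.
    a = np.asarray(a, dtype=float)
    m = a.mean()
    return np.array([np.mean(a[:-r] * a[r:]) - m**2 for r in range(1, r_max + 1)])

# Step-wise memory function: F(r) constant for r <= N, zero beyond (illustrative values).
F = np.full(10, 0.04)
chain = simulate_additive_chain(F, a_bar=0.5, length=200_000)
print(correlation_function(chain, r_max=20))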
Thomas Decomposition and Nonlinear Control Systems
This paper applies the Thomas decomposition technique to nonlinear control
systems, in particular to the study of the dependence of the system behavior on
parameters. Thomas' algorithm is a symbolic method which splits a given system
of nonlinear partial differential equations into a finite family of so-called
simple systems which are formally integrable and define a partition of the
solution set of the original differential system. In general, different simple
systems of a Thomas decomposition describe different structural behavior of the
control system. The paper gives an introduction to the Thomas decomposition
method and shows how notions such as invertibility, observability and flat
outputs can be studied. A Maple implementation of Thomas' algorithm is used to
illustrate the techniques on explicit examples.
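As a toy illustration of the case splitting performed by a Thomas decomposition of a parameter-dependent system (a made-up example, not one from the paper), consider the single equation $p\,\dot{x}(t) = u(t)$ with a constant parameter $p$. Its solution set is partitioned by the two simple systems
\[
  \{\, p \neq 0,\ \dot{x} = u/p \,\} \qquad\text{and}\qquad \{\, p = 0,\ u = 0 \,\}:
\]
in the first case the input is recovered from the output as $u = p\,\dot{x}$, while in the second the equation degenerates to a constraint on the input and leaves $x$ unconstrained, so structural properties such as invertibility depend on which branch the parameter selects.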
The Magnus expansion and some of its applications
The approximate solution of linear systems of differential equations with
varying coefficients is a recurrent problem shared by a number of scientific
and engineering areas, ranging from Quantum Mechanics to Control Theory. When
formulated in operator or matrix form, the Magnus expansion furnishes an
elegant setting in which to build up approximate exponential representations of the
solution of the system. It provides a power series expansion for the
corresponding exponent and is sometimes referred to as Time-Dependent
Exponential Perturbation Theory. Every Magnus approximant corresponds in
Perturbation Theory to a partial re-summation of infinitely many terms, with the
important additional property of preserving at any order certain symmetries of
the exact solution. The goal of this review is threefold. First, to collect a
number of developments scattered through half a century of scientific
literature on the Magnus expansion. They concern methods for the generation of
terms in the expansion, estimates of the radius of convergence of the series,
generalizations and related non-perturbative expansions. Second, to provide a
bridge to its implementation as a generator of special-purpose numerical
integration methods, a field of intense activity during the last decade. Third,
to illustrate with examples the kind of results one can expect from the Magnus
expansion in comparison with those from both perturbative schemes and standard
numerical integrators. We buttress this discussion with a review of the wide range
of physical applications of the Magnus expansion found in the literature.
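For concreteness, the expansion writes the solution of $Y'(t) = A(t)\,Y(t)$ as $Y(t) = \exp(\Omega(t))\,Y(0)$ with $\Omega = \Omega_1 + \Omega_2 + \cdots$, where $\Omega_1(t) = \int_0^t A(t_1)\,dt_1$ and $\Omega_2(t) = \tfrac{1}{2}\int_0^t dt_1 \int_0^{t_1} dt_2\,[A(t_1),A(t_2)]$. Below is a minimal sketch, not taken from the review, of the simplest Magnus-type integrator (the second-order exponential midpoint rule); the driven two-level example is illustrative. When $A(t)$ is skew-Hermitian the scheme stays on the unitary group, which is the symmetry-preservation property mentioned above.

import numpy as np
from scipy.linalg import expm

def magnus2_step(A, t, h, Y):
    # Second-order Magnus step: Omega is approximated by h * A(t + h/2)
    # (exponential midpoint rule).
    return expm(h * A(t + 0.5 * h)) @ Y

def magnus2_solve(A, Y0, t0, t1, n_steps):
    # Propagate Y'(t) = A(t) Y(t) from t0 to t1 with n_steps Magnus steps.
    h = (t1 - t0) / n_steps
    Y, t = np.array(Y0, dtype=complex), t0
    for _ in range(n_steps):
        Y = magnus2_step(A, t, h, Y)
        t += h
    return Y

# Illustrative example: a driven two-level system, A(t) = -i (sigma_z + cos(t) sigma_x).
sx = np.array([[0.0, 1.0], [1.0, 0.0]], dtype=complex)
sz = np.array([[1.0, 0.0], [0.0, -1.0]], dtype=complex)
A = lambda t: -1j * (sz + np.cos(t) * sx)
U = magnus2_solve(A, np.eye(2, dtype=complex), 0.0, 10.0, 2000)
print(np.allclose(U.conj().T @ U, np.eye(2)))   # True: unitarity is preserved

Higher-order Magnus integrators evaluate A(t) at a few quadrature nodes per step and combine the samples through commutators.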
Incidence of first primary central nervous system tumors in California, 2001–2005
We examined the incidence of first primary central nervous system tumors (PCNST) in California during 2001–2005. This study period represents the first five years of data collection on benign PCNST by the California Cancer Registry. California's age-adjusted incidence rates (AAIR) were 5.5 per 100,000 for malignant PCNST and 8.5 per 100,000 for benign PCNST. Malignant PCNST incidence was highest among non-Hispanic white males (7.8 per 100,000), while benign PCNST incidence was highest among African American females (10.5 per 100,000). Hispanics, those with the lowest socioeconomic status, and those who lived in rural California were found to be significantly younger at diagnosis. Glioblastoma was the most frequent malignant histology, while meningioma had the highest incidence among benign histologies (2.6 and 4.5 per 100,000, respectively). This study is the first in the US to compare malignant to benign PCNST using a population-based data source. It illustrates the importance of PCNST surveillance in California and in diverse communities.
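As background on the age-adjusted rates quoted above, a minimal sketch of direct age standardization; the strata, counts, person-years, and standard-population weights below are made up for illustration and are not the registry's data.

def age_adjusted_rate(cases, person_years, std_weights):
    # Direct standardization: weight each age-specific rate by the
    # corresponding standard-population proportion; report per 100,000.
    assert abs(sum(std_weights) - 1.0) < 1e-9
    return 1e5 * sum(w * c / py
                     for c, py, w in zip(cases, person_years, std_weights))

# Illustrative (made-up) numbers for three age strata:
print(age_adjusted_rate(cases=[12, 40, 95],
                        person_years=[1.2e6, 0.9e6, 0.5e6],
                        std_weights=[0.5, 0.3, 0.2]))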
Hierarchical Models in the Brain
This paper describes a general model that subsumes many parametric models for
continuous data. The model comprises hidden layers of state-space or dynamic
causal models, arranged so that the output of one provides input to another. The
ensuing hierarchy furnishes a model for many types of data, of arbitrary
complexity. Special cases range from the general linear model for static data to
generalised convolution models, with system noise, for nonlinear time-series
analysis. Crucially, all of these models can be inverted using exactly the same
scheme, namely, dynamic expectation maximization. This means that a single model
and optimisation scheme can be used to invert a wide range of models. We present
the model and a brief review of its inversion to disclose the relationships
among apparently diverse generative models of empirical data. We then show
that this inversion can be formulated as a simple neural network and may provide
a useful metaphor for inference and learning in the brain.
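A minimal generative sketch of the hierarchical idea (not the paper's notation or code; the matrices, dimensions, and noise levels are illustrative): each layer is a linear state-space model, and the output of one layer provides the input to the layer below, so data at the lowest level inherit structure from every level above.

import numpy as np

def simulate_hierarchy(layers, T, rng=None):
    # Each layer is a dict with matrices A, B, C:
    #   x[t+1] = A @ x[t] + B @ v[t] + state noise
    #   y[t]   = C @ x[t] + observation noise
    # and the output y of each layer becomes the input v of the layer below.
    rng = np.random.default_rng() if rng is None else rng
    v = rng.standard_normal((T, layers[0]["B"].shape[1]))  # causes at the top level
    for layer in layers:                                   # ordered top -> bottom
        A, B, C = layer["A"], layer["B"], layer["C"]
        x = np.zeros(A.shape[0])
        y = np.empty((T, C.shape[0]))
        for t in range(T):
            y[t] = C @ x + 0.01 * rng.standard_normal(C.shape[0])
            x = A @ x + B @ v[t] + 0.01 * rng.standard_normal(A.shape[0])
        v = y                                              # feed the next layer down
    return v                                               # data at the lowest level

# Two-layer example: a slowly varying top layer modulating a faster bottom layer.
top = dict(A=np.array([[0.99]]), B=np.array([[0.1]]), C=np.array([[1.0]]))
bottom = dict(A=np.array([[0.9, -0.2], [0.2, 0.9]]),
              B=np.array([[1.0], [0.0]]),
              C=np.array([[1.0, 0.0]]))
data = simulate_hierarchy([top, bottom], T=500)

Inverting such a model with a single scheme (dynamic expectation maximization in the paper) means estimating the hidden states and causes at every level from the generated data alone.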