A cyclo-stationary complex multichannel Wiener filter for the prediction of wind speed and direction
This paper develops a linear predictor for wind speed and direction forecasting in time and across different sites. The wind speed and direction are modelled via the magnitude and phase of a complex-valued time series. A multichannel adaptive filter is designed to predict this signal based on its past values and the spatio-temporal correlation between wind signals measured at numerous geographical locations. The time-varying nature of the underlying system and the annual cycle of seasons motivate the development of a cyclo-stationary Wiener filter, which is tested on hourly mean wind speed and direction data from 13 weather stations across the UK and shown to provide an improvement over both stationary Wiener filtering and a recent auto-regressive approach.
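For intuition, here is a minimal sketch of the complex-valued encoding and a one-step-ahead linear predictor fitted by least squares. The data are synthetic, the filter order p is arbitrary, and only a single-channel stationary baseline is shown; a cyclo-stationary multichannel version would additionally fit separate coefficient sets per phase of the daily or annual cycle and stack channels from several sites. This is not the paper's exact filter.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic hourly wind: speed (m/s) and direction (rad) with a daily cycle.
t = np.arange(2000)
speed = 8 + 2 * np.sin(2 * np.pi * t / 24) + rng.normal(0, 0.5, t.size)
direction = 0.3 * np.sin(2 * np.pi * t / 24) + rng.normal(0, 0.1, t.size)

# Encode speed and direction as one complex-valued series.
z = speed * np.exp(1j * direction)

# One-step-ahead linear (Wiener-style) predictor of order p,
# fitted by complex least squares on past samples.
p = 6
X = np.column_stack([z[p - k - 1 : len(z) - k - 1] for k in range(p)])
y = z[p:]
w, *_ = np.linalg.lstsq(X, y, rcond=None)

z_hat = X @ w
print("RMS prediction error:", np.sqrt(np.mean(np.abs(y - z_hat) ** 2)))
```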
A methodology for collecting valid software engineering data
An effective data collection method for evaluating software development methodologies and for studying the software development process is described. The method uses goal-directed data collection to evaluate methodologies with respect to the claims made for them. Such claims are used as a basis for defining the goals of the data collection, establishing a list of questions of interest to be answered by data analysis, defining a set of data categorization schemes, and designing a data collection form. The data to be collected are based on the changes made to the software during development, and are obtained when the changes are made. To ensure accuracy of the data, validation is performed concurrently with software development and data collection. Validation is based on interviews with those people supplying the data. Results from using the methodology show that data validation is a necessary part of change data collection. Without it, as much as 50% of the data may be erroneous. Feasibility of the data collection methodology was demonstrated by applying it to five different projects in two different environments. The application showed that the methodology was both feasible and useful.
Crossover from quasi-static to dense flow regime in compressed frictional granular media
We investigate the evolution of multi-scale mechanical properties towards the macroscopic mechanical instability in frictional granular media under multiaxial compressive loading. Spatial correlations of shear stress redistribution following nucleating contact sliding events and shear strain localization are investigated. We report growing correlation lengths associated with both the shear stress and shear strain fields that diverge simultaneously on approaching the transition to a dense flow regime. This shows that the transition from the quasi-static to the dense flow regime can be interpreted as a critical phase transition. Our results suggest that no shear band with a characteristic thickness has formed at the onset of instability.
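The reported simultaneous divergence is conventionally expressed as a critical power-law scaling of the correlation lengths; the control parameter and exponents below are generic placeholders, not values taken from the paper.

```latex
% Generic critical-scaling ansatz for the reported divergences; the control
% parameter (axial strain \varepsilon) and the exponents \nu_\sigma, \nu_\gamma
% are illustrative assumptions, not values from the paper.
\[
  \xi_\sigma \sim (\varepsilon_c - \varepsilon)^{-\nu_\sigma},
  \qquad
  \xi_\gamma \sim (\varepsilon_c - \varepsilon)^{-\nu_\gamma},
  \qquad
  \varepsilon \to \varepsilon_c^{-},
\]
% where \xi_\sigma and \xi_\gamma are the correlation lengths of the
% shear-stress and shear-strain fields and \varepsilon_c marks the onset
% of the dense flow regime.
```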
Diffusion coefficients for multi-step persistent random walks on lattices
We calculate the diffusion coefficients of persistent random walks on lattices, where the direction of a walker at a given step depends on the memory of a certain number of previous steps. In particular, we describe a simple method which enables us to obtain explicit expressions for the diffusion coefficients of walks with two-step memory on different classes of one-, two- and higher-dimensional lattices.
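As a toy illustration of estimating such a diffusion coefficient numerically, the sketch below simulates a persistent walk with one-step memory on the square lattice and reads off D from the mean-square displacement via MSD ≈ 2dDt. The persistence probability and lattice are arbitrary assumptions, and the paper's explicit two-step-memory expressions are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(1)

# Persistent random walk on the 2D square lattice: with probability q the
# walker repeats its previous step, otherwise it picks one of the four
# directions uniformly.  (One-step memory; the paper treats longer memories.)
steps = np.array([(1, 0), (-1, 0), (0, 1), (0, -1)])
q, n_walkers, n_steps = 0.6, 5000, 2000

prev = rng.integers(0, 4, n_walkers)
pos = np.zeros((n_walkers, 2))
for _ in range(n_steps):
    repeat = rng.random(n_walkers) < q
    fresh = rng.integers(0, 4, n_walkers)
    prev = np.where(repeat, prev, fresh)
    pos += steps[prev]

# Diffusion coefficient from the mean-square displacement, MSD ~ 2 d D t.
msd = np.mean(np.sum(pos**2, axis=1))
print("estimated D:", msd / (2 * 2 * n_steps))
```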
Structured Training for Neural Network Transition-Based Parsing
We present structured perceptron training for neural network transition-based dependency parsing. We learn the neural network representation using a gold corpus augmented by a large number of automatically parsed sentences. Given this fixed network representation, we learn a final layer using the structured perceptron with beam-search decoding. On the Penn Treebank, our parser reaches 94.26% unlabeled and 92.41% labeled attachment accuracy, which to our knowledge is the best accuracy on Stanford Dependencies to date. We also provide in-depth ablative analysis to determine which aspects of our model provide the largest gains in accuracy.
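The training scheme pairs a fixed representation with a structured perceptron trained under beam-search decoding. The toy sketch below shows that combination on a two-action transition system with stand-in indicator features; the transition system, features, and data are invented for illustration and are not the paper's parser.

```python
from collections import defaultdict

def features(t, history, action):
    # Indicator features: (position, previous action, action) plus an
    # action bias.  Stand-in template, not the paper's feature set.
    prev = history[-1] if history else -1
    return [("pos_prev_act", t, prev, action), ("act", action)]

def score(w, feats):
    return sum(w[f] for f in feats)

def beam_decode(w, n_steps, beam_size=4):
    beams = [([], 0.0)]                      # (action history, model score)
    for t in range(n_steps):
        cand = [(h + [a], s + score(w, features(t, h, a)))
                for h, s in beams for a in (0, 1)]   # actions: SHIFT=0, REDUCE=1
        beams = sorted(cand, key=lambda c: -c[1])[:beam_size]
    return beams[0][0]

def perceptron_update(w, gold):
    pred = beam_decode(w, len(gold))
    if pred != gold:                         # structured-perceptron update
        for t in range(len(gold)):
            for f in features(t, gold[:t], gold[t]):
                w[f] += 1.0
            for f in features(t, pred[:t], pred[t]):
                w[f] -= 1.0

w = defaultdict(float)
gold = [0, 0, 1, 0, 1]
for _ in range(20):
    perceptron_update(w, gold)
print(beam_decode(w, len(gold)) == gold)     # should recover gold sequence
```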
Analyzing Firm Performance in the Insurance Industry Using Frontier Efficiency Methods
In this introductory chapter to an upcoming book, the authors discuss the two principal types of efficiency frontier methodologies - the econometric (parametric) approach and the mathematical programming (non-parametric) approach. Frontier efficiency methodologies are discussed as useful in a variety of contexts: they can be used for testing economic hypotheses; providing guidance to regulators and policymakers; comparing economic performance across countries; and informing management of the effects of procedures and strategies adopted by the firm. The econometric approach requires the specification of a production, cost, revenue, or profit function as well as assumptions about error terms. But this methodology is vulnerable to errors in the specification of the functional form or error term. The mathematical programming or linear programming approach avoids this type of error and measures any departure from the frontier as a relative inefficiency. Because each of these methods has advantages and disadvantages, it is recommended to estimate efficiency using more than one method. An important step in efficiency analysis is the definition of inputs and outputs and their prices. Insurer inputs can be classified into three principal groups: labor, business services and materials, and capital. Three principal approaches have been used to measure outputs in the financial services sector: the asset or intermediation approach, the user-cost approach, and the value-added approach. The asset approach treats firms as pure financial intermediaries and would be inappropriate for insurers because they provide other services. The user-cost method determines whether a financial product is an input or output based on its net contribution to the revenues of the firm. This method requires precise data on products, revenues and opportunity costs, which are difficult to estimate in insurance. The value-added approach is judged the most appropriate method for studying insurance efficiency. It considers all asset and liability categories to have some output characteristics rather than distinguishing inputs from outputs. In order to measure efficiency in the insurance industry, in which outputs are mostly intangible, measurable services must be defined. The three principal services provided by insurance companies are risk pooling and risk-bearing, "real" financial services relating to insured losses, and intermediation. The authors discuss how these services can be measured as outputs in value-added analysis. They then summarize the existing efficiency literature.
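The mathematical-programming approach mentioned above is commonly operationalized as data envelopment analysis (DEA). The sketch below solves a minimal input-oriented, constant-returns-to-scale DEA linear program with scipy.optimize.linprog on invented insurer data, where the input columns loosely follow the labor / business-services / capital classification; it is an illustration of the general technique, not the chapter's model.

```python
import numpy as np
from scipy.optimize import linprog

# Invented data: rows are insurers, X columns are inputs
# (labor, business services/materials, capital), Y is output (value added).
X = np.array([[5.0, 3.0, 8.0],
              [6.0, 2.0, 9.0],
              [4.0, 4.0, 7.0],
              [7.0, 5.0, 12.0]])
Y = np.array([[10.0], [9.0], [11.0], [12.0]])
n, m = X.shape          # insurers, inputs
s = Y.shape[1]          # outputs

def efficiency(o):
    # Variables: v = [theta, lambda_1 .. lambda_n]; minimize theta.
    c = np.r_[1.0, np.zeros(n)]
    # Inputs:  sum_j lambda_j * X[j, i] <= theta * X[o, i]
    A_in = np.hstack([-X[o].reshape(-1, 1), X.T])
    b_in = np.zeros(m)
    # Outputs: sum_j lambda_j * Y[j, r] >= Y[o, r]
    A_out = np.hstack([np.zeros((s, 1)), -Y.T])
    b_out = -Y[o]
    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[b_in, b_out],
                  bounds=[(0, None)] * (n + 1))
    return res.fun      # theta = 1.0 means the insurer is on the frontier

for o in range(n):
    print(f"insurer {o}: efficiency {efficiency(o):.3f}")
```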
The structure, conduct, and regulation of the property-liability insurance industry