Gompertz: A Scilab Program for Estimating Gompertz Curve Using Gauss-Newton Method of Least Squares
A computer program for estimating the Gompertz curve using the Gauss-Newton method of least squares is described in detail. It is based on the estimation technique proposed in Reddy (1985). The program is developed in Scilab (version 3.1.1), a freely available scientific software package that can be downloaded from http://www.scilab.org/. Data are fed into the program from an external disk file, which should be in Microsoft Excel format. The output contains the sample size, tolerance limit, the initial as well as the final estimates of the parameters, standard errors, the values of the Gauss-Newton normal equations (namely GN_1, GN_2 and GN_3), the number of iterations, the variance (sigma^2), the Durbin-Watson statistic, goodness-of-fit measures such as R^2 and the D value, the covariance matrix, and the residuals. It also displays a graphical output of the estimated curve vis-à-vis the observed curve. It is an improved version of the program proposed in Dastidar (2005).
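The Gauss-Newton iteration at the heart of the program can be illustrated in a few lines. The sketch below is a minimal Python analogue of the Scilab program described above, not the program itself: the function names, starting values and convergence tolerance are illustrative assumptions, and the program's full output (Durbin-Watson statistic, covariance matrix, plots) is omitted.

```python
import numpy as np

def gompertz(t, a, b, c):
    """Gompertz curve: y = a * exp(-b * exp(-c * t))."""
    return a * np.exp(-b * np.exp(-c * t))

def gauss_newton_gompertz(t, y, theta0, tol=1e-10, max_iter=100):
    """Fit (a, b, c) to data (t, y) by Gauss-Newton least squares."""
    theta = np.asarray(theta0, dtype=float)
    for _ in range(max_iter):
        a, b, c = theta
        e = np.exp(-c * t)
        f = a * np.exp(-b * e)
        r = y - f                                   # residuals
        # Jacobian of the model with respect to (a, b, c)
        J = np.column_stack([f / a, -f * e, f * b * t * e])
        # Gauss-Newton step: solve the linearized least-squares problem
        delta, *_ = np.linalg.lstsq(J, r, rcond=None)
        theta = theta + delta
        if np.linalg.norm(delta) < tol:             # convergence test
            break
    return theta
```

Each iteration solves the linearized problem J·δ = r and updates the parameters until the step norm falls below the tolerance, which is the essence of the Gauss-Newton method.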
Capital Requirements for Latin American Banks in Relation to their Market Risks: The Relevance of the Basle 1996 Amendment to Latin America
Banks' market or 'trading' risks have increased noticeably over the past years, largely as a result of the growth of liquid assets on banks' balance sheets and the increase in banks' off-balance-sheet activities. Well-publicized bank failures and significant capital losses have focussed further attention on these developments. In January 1996, the Basle Committee recommended the imposition of capital charges related to banks' trading risks, and the European Community's Capital Adequacy Directive (CAD) came into force on January 1st, adopting, in part, the Basle Amendment. The G10 countries are committed to full implementation of these recommendations by the end of 1997. This paper reviews the main features of the Basle Amendment, which allows banks a choice between a 'standardized methodology' and the use of their own internal models, subject to the authorization of the relevant supervisor and a set of parameter values. The relevance of this regulation for Latin America is analysed in the light of the region's characteristics. We suggest that these characteristics increase rather than diminish the importance of the implementation of market risk capital requirements in Latin America.
Uncertainties of predictions from parton distribution functions II: the Hessian method
We develop a general method to quantify the uncertainties of parton distribution functions and their physical predictions, with emphasis on incorporating all relevant experimental constraints. The method uses the Hessian formalism to study an effective chi-squared function that quantifies the fit between theory and experiment. Key ingredients are a recently developed iterative procedure to calculate the Hessian matrix in the difficult global analysis environment, and the use of parameters defined as components along appropriately normalized eigenvectors. The result is a set of 2d Eigenvector Basis parton distributions (where d=16 is the number of parton parameters) from which the uncertainty on any physical quantity due to the uncertainty in parton distributions can be calculated. We illustrate the method by applying it to calculate uncertainties of gluon and quark distribution functions, W boson rapidity distributions, and the correlation between W and Z production cross sections.
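The 2d eigenvector-basis sets lead to a simple "master formula" for propagating the PDF uncertainty to any observable, and to a correlation measure between two observables such as the W and Z cross sections. The Python sketch below is an illustration, not the paper's code: it assumes hypothetical arrays holding an observable evaluated on each plus/minus eigenvector pair (d = 16 pairs in the paper; the toy test below uses d = 2).

```python
import numpy as np

def hessian_uncertainty(x_plus, x_minus):
    # Symmetric "master formula":
    #   DX = 1/2 * sqrt( sum_i (X_i^+ - X_i^-)^2 )
    # where X_i^+/- is the observable on the i-th eigenvector sets.
    d = np.asarray(x_plus) - np.asarray(x_minus)
    return 0.5 * np.sqrt(np.sum(d ** 2))

def hessian_correlation(x_plus, x_minus, y_plus, y_minus):
    # Correlation cosine between observables X and Y over the same sets:
    #   cos(phi) = sum_i (X_i^+ - X_i^-)(Y_i^+ - Y_i^-) / (4 * DX * DY)
    dx = np.asarray(x_plus) - np.asarray(x_minus)
    dy = np.asarray(y_plus) - np.asarray(y_minus)
    return np.sum(dx * dy) / (4.0 * hessian_uncertainty(x_plus, x_minus)
                                  * hessian_uncertainty(y_plus, y_minus))
```

A correlation cosine near 1 means the two observables move together under PDF variations, as one would check for W versus Z production.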
Structure of Defective Crystals at Finite Temperatures: A Quasi-Harmonic Lattice Dynamics Approach
In this paper we extend the classical method of lattice dynamics to defective crystals with partial symmetries. We start from a nominal defect configuration and first relax it statically. Given the static equilibrium configuration, we use a quasi-harmonic lattice dynamics approach to approximate the free energy. Finally, the defect structure at a finite temperature is obtained by minimizing the approximate Helmholtz free energy. For higher temperatures we take the relaxed configuration at a lower temperature as the reference configuration. This method can be used to semi-analytically study the structure of defects at low but non-zero temperatures, where molecular dynamics cannot be used. As an example, we obtain the finite-temperature structure of two 180° domain walls in a 2-D lattice of interacting dipoles. We dynamically relax both the position and polarization vectors. In particular, we show that the domain-wall thickness increases with temperature.
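The relaxation loop described above, a static energy plus a quasi-harmonic vibrational free energy minimized over the configuration, can be sketched on a toy one-mode model. Everything below is an illustrative assumption, not the paper's dipole-lattice model: reduced units, a single configuration parameter a, and a mode frequency that softens exponentially with expansion (a Grüneisen-like ansatz).

```python
import numpy as np

HBAR = KB = 1.0  # reduced units (assumption)

def mode_free_energy(omega, T):
    # Quasi-harmonic free energy of one phonon mode at temperature T:
    #   f = hbar*omega/2 + kB*T * ln(1 - exp(-hbar*omega/(kB*T)))
    x = HBAR * omega / (KB * T)
    return 0.5 * HBAR * omega + KB * T * np.log1p(-np.exp(-x))

def helmholtz(a, T, k=1.0, a0=1.0, omega0=1.0, gamma=2.0):
    # Static energy plus vibrational term; the mode softens
    # exponentially as the configuration expands.
    E_static = 0.5 * k * (a - a0) ** 2
    omega = omega0 * np.exp(-gamma * (a - a0))
    return E_static + mode_free_energy(omega, T)

def relax(T, a_grid):
    # "Relaxation" by brute force: pick the configuration on the grid
    # that minimizes the approximate Helmholtz free energy.
    return a_grid[np.argmin(helmholtz(a_grid, T))]
```

Because the vibrational entropy favors softer modes, the free-energy minimum shifts to larger a as T grows, mimicking how the relaxed structure (here, a stand-in for the domain-wall profile) changes with temperature.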
An Extended Empirical Saddlepoint Approximation for Intractable Likelihoods
The challenges posed by complex stochastic models used in computational ecology, biology and genetics have stimulated the development of approximate approaches to statistical inference. Here we focus on Synthetic Likelihood (SL), a procedure that reduces the observed and simulated data to a set of summary statistics, and quantifies the discrepancy between them through a synthetic likelihood function. SL requires little tuning, but it relies on the approximate normality of the summary statistics. We relax this assumption by proposing a novel, more flexible, density estimator: the Extended Empirical Saddlepoint approximation. In addition to proving the consistency of SL, under either the new or the Gaussian density estimator, we illustrate the method using two examples. One of these is a complex individual-based forest model for which SL offers one of the few practical possibilities for statistical inference. The examples show that the new density estimator is able to capture large departures from normality, while being scalable to high dimensions, and this in turn leads to more accurate parameter estimates, relative to the Gaussian alternative. The new density estimator is implemented by the esaddle R package, which can be found on the Comprehensive R Archive Network (CRAN).
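For context, the baseline that the Extended Empirical Saddlepoint estimator relaxes is the Gaussian synthetic likelihood: simulate summary statistics at a given parameter, fit a multivariate normal to them, and evaluate the observed summaries under that density. The Python sketch below shows that Gaussian baseline only (the paper's actual estimator lives in the esaddle R package); the function names and the toy simulator are illustrative assumptions.

```python
import numpy as np

def gaussian_synthetic_loglik(theta, simulate, s_obs, n_sim=500, rng=None):
    # Gaussian synthetic log-likelihood:
    # 1) simulate n_sim summary-statistic vectors at parameter theta,
    # 2) fit a multivariate normal (mean mu, covariance Sigma) to them,
    # 3) evaluate the observed summaries s_obs under that normal density.
    rng = np.random.default_rng(rng)
    S = np.array([simulate(theta, rng) for _ in range(n_sim)])
    mu = S.mean(axis=0)
    Sigma = np.cov(S, rowvar=False)
    diff = s_obs - mu
    _, logdet = np.linalg.slogdet(Sigma)
    quad = diff @ np.linalg.solve(Sigma, diff)
    return -0.5 * (quad + logdet + diff.size * np.log(2.0 * np.pi))

# Toy simulator (illustrative): data are N(theta, 1); the summaries are
# the sample mean and sample variance of 50 draws.
def simulate_summaries(theta, rng):
    x = rng.normal(theta, 1.0, size=50)
    return np.array([x.mean(), x.var()])
```

Maximizing this function over theta gives the SL point estimate; the paper's contribution is to replace the multivariate-normal step with a saddlepoint-based density that tolerates non-Gaussian summaries.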
Feature extraction using extrema sampling of discrete derivatives for spike sorting in implantable upper-limb neural prostheses
Next-generation neural interfaces for upper-limb (and other) prostheses aim to develop implantable interfaces for one or more nerves, each interface having many neural signal channels that work reliably in the stump without harming the nerves. To achieve real-time multi-channel processing it is important to integrate spike sorting on-chip to overcome limitations in transmission bandwidth. This requires computationally efficient algorithms for feature extraction and clustering suitable for low-power hardware implementation. This paper describes a new feature extraction method for real-time spike sorting based on extrema analysis (namely positive peaks and negative peaks) of spike shapes and their discrete derivatives at different frequency bands. Employing simulation across different datasets, the accuracy and computational complexity of the proposed method are assessed and compared with other methods. The average classification accuracy of the proposed method in conjunction with online sorting (O-Sort) is 91.6%, outperforming all the other methods tested with the O-Sort clustering algorithm. The proposed method offers a better tradeoff between classification error and computational complexity, making it a particularly strong choice for on-chip spike sorting.
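The core feature-extraction idea, taking the positive and negative peaks of the spike waveform and of its discrete derivatives, can be sketched as follows. This is a simplified illustration, not the paper's algorithm: the per-frequency-band filtering step is omitted and the function name is an assumption.

```python
import numpy as np

def extrema_features(spike):
    # Feature vector: positive peak (max) and negative peak (min) of the
    # spike waveform and of its first and second discrete derivatives.
    # Only cheap subtractions and comparisons are needed, which is what
    # makes this style of feature attractive for low-power hardware.
    d1 = np.diff(spike)   # first discrete derivative
    d2 = np.diff(d1)      # second discrete derivative
    feats = []
    for sig in (np.asarray(spike), d1, d2):
        feats.extend([sig.max(), sig.min()])
    return np.array(feats)
```

Each detected spike is reduced to a short feature vector like this before being passed to a clustering stage such as O-Sort.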