The Distribution of the Asymptotic Number of Citations to Sets of Publications by a Researcher or From an Academic Department Are Consistent With a Discrete Lognormal Model
How to quantify the impact of a researcher's or an institution's body of work
is a matter of increasing importance to scientists, funding agencies, and
hiring committees. The use of bibliometric indicators, such as the h-index or
the Journal Impact Factor, has become widespread despite their known
limitations. We argue that most existing bibliometric indicators are
inconsistent, biased, and, worst of all, susceptible to manipulation. Here, we
pursue a principled approach to the development of an indicator to quantify the
scientific impact of both individual researchers and research institutions
grounded on the functional form of the distribution of the asymptotic number of
citations. We validate our approach using the publication records of 1,283
researchers from seven scientific and engineering disciplines and the chemistry
departments at the 106 U.S. research institutions classified as "very high
research activity". Our approach has three distinct advantages. First, it
accurately captures the overall scientific impact of researchers at all career
stages, as measured by asymptotic citation counts. Second, unlike other
measures, our indicator is resistant to manipulation and rewards publication
quality over quantity. Third, our approach captures the time-evolution of the
scientific impact of research institutions.
Comment: 20 pages, 11 figures, 3 tables
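As a purely illustrative sketch of the kind of model the abstract names, the snippet below draws synthetic "asymptotic citation counts" from a discretized lognormal and recovers the parameters. The discretization (flooring) and the moment-style fit are assumptions for illustration, not the authors' actual indicator.

```python
import math
import random
import statistics

def sample_citations(mu, sigma, n, seed=42):
    # Synthetic citation counts: floor of a lognormal variate. The
    # paper's exact discretization may differ; this is an assumption.
    rng = random.Random(seed)
    return [int(rng.lognormvariate(mu, sigma)) for _ in range(n)]

def fit_discrete_lognormal(counts):
    # Crude fit: mean and std of log(count) over nonzero counts
    # approximate the lognormal parameters (mu, sigma).
    logs = [math.log(c) for c in counts if c > 0]
    return statistics.mean(logs), statistics.pstdev(logs)

counts = sample_citations(mu=2.0, sigma=1.0, n=20000)
mu_hat, sigma_hat = fit_discrete_lognormal(counts)
```

With enough samples the recovered parameters land close to the generating ones, which is the basic consistency check behind fitting a functional form to citation data.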
What Happened to Risk Management During the 2008-09 Financial Crisis?
When dealing with market risk under the Basel II Accord, variation pays in the form of lower capital requirements and higher profits. Typically, GARCH-type models are chosen to forecast Value-at-Risk (VaR) using a single risk model. In this paper we illustrate two useful variations to the standard mechanism for choosing forecasts, namely: (i) combining different forecast models for each period, such as a daily model that forecasts the supremum or infimum value for the VaR; (ii) alternatively, selecting a single model to forecast VaR, and then modifying the daily forecast, depending on the recent history of violations under the Basel II Accord. We illustrate these points using the Standard & Poor's 500 Composite Index. In many cases we find significant decreases in the capital requirements, while incurring a number of violations that stays within the Basel II Accord limits.
Keywords: risk management; violations; conservative risk strategy; aggressive risk strategy; value-at-risk forecast
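The first variation, taking the supremum or infimum of several models' daily VaR forecasts, can be sketched in a few lines. The forecast numbers are made up for illustration; this is not the paper's implementation.

```python
def combine_var(forecasts, strategy="conservative"):
    # VaR forecasts are quoted here as negative returns (losses).
    # "conservative" takes the infimum (largest forecast loss) across
    # models; "aggressive" takes the supremum (smallest forecast loss).
    if strategy == "conservative":
        return min(forecasts)
    if strategy == "aggressive":
        return max(forecasts)
    raise ValueError(f"unknown strategy: {strategy}")

# Hypothetical one-day-ahead VaR forecasts (% daily return) from
# three GARCH-type models.
daily_forecasts = [-2.1, -1.8, -2.6]
conservative = combine_var(daily_forecasts, "conservative")
aggressive = combine_var(daily_forecasts, "aggressive")
```

The conservative choice raises capital requirements but reduces violations; the aggressive choice does the opposite, which is exactly the trade-off the abstract examines.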
GFC-Robust Risk Management Strategies under the Basel Accord
A risk management strategy is proposed as being robust to the Global Financial Crisis (GFC) by selecting a Value-at-Risk (VaR) forecast that combines the forecasts of different VaR models. The robust forecast is based on the median of the point VaR forecasts of a set of conditional volatility models. This risk management strategy is GFC-robust in the sense that maintaining the same risk management strategies before, during and after a financial crisis would lead to comparatively low daily capital charges and violation penalties. The new method is illustrated by using the S&P500 index before, during and after the 2008-09 global financial crisis. We investigate the performance of a variety of single and combined VaR forecasts in terms of daily capital requirements and violation penalties under the Basel II Accord, as well as other criteria. The median VaR risk management strategy is GFC-robust as it provides stable results across different periods relative to other VaR forecasting models. The new strategy based on combined forecasts of single models is straightforward to incorporate into existing computer software packages that are used by banks and other financial institutions.
Keywords: Value-at-Risk (VaR); daily capital charges; optimizing strategy; robust forecasts; violation penalties; global financial crisis; Basel II Accord; aggressive risk management strategy; conservative risk management strategy
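The median combination described above is simple enough to sketch directly. The model names and forecast values below are hypothetical; only the median rule itself comes from the abstract.

```python
import statistics

def median_var(forecasts):
    # GFC-robust combination: take the median of the point VaR
    # forecasts produced by a set of conditional volatility models.
    return statistics.median(forecasts)

# Hypothetical one-day-ahead VaR forecasts (% daily return) from
# four conditional volatility models.
forecasts = {"GARCH": -2.3, "EGARCH": -2.9, "GJR-GARCH": -2.5, "EWMA": -2.0}
robust = median_var(list(forecasts.values()))
```

Because the median discards the extremes, a single model that over- or under-reacts during a crisis cannot drag the combined forecast with it, which is why the strategy is stable across pre-crisis, crisis, and post-crisis periods.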
A decision rule to minimize daily capital charges in forecasting value-at-risk
Under the Basel II Accord, banks and other Authorized Deposit-taking Institutions (ADIs) have to communicate their daily risk estimates to the monetary authorities at the beginning of the trading day, using a variety of Value-at-Risk (VaR) models to measure risk. Sometimes the risk estimates communicated using these models are too high, thereby leading to large capital requirements and high capital costs. At other times, the risk estimates are too low, leading to excessive violations, so that realised losses are above the estimated risk. In this paper we propose a learning strategy that complements existing methods for calculating VaR and lowers daily capital requirements, while restricting the number of endogenous violations within the Basel II Accord penalty limits. We suggest a decision rule that responds to violations in a discrete and instantaneous manner, while adapting more slowly in periods of no violations. We apply the proposed strategy to Standard & Poor's 500 Index and show there can be substantial savings in daily capital charges, while restricting the number of violations to within the Basel II penalty limits.
Keywords: value-at-risk; daily capital charges; optimizing strategy; risk forecasts; endogenous violations; frequency of violations
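A decision rule of the shape the abstract describes, reacting instantaneously to a violation and relaxing slowly in quiet periods, can be sketched as a scaling multiplier on the VaR forecast. The jump and decay constants below are arbitrary illustrative assumptions, not the paper's calibrated rule.

```python
def next_multiplier(violation, multiplier, jump=1.5, decay=0.98):
    # Hypothetical decision rule in the spirit of the abstract: after a
    # violation, jump immediately to a conservative scaling of the VaR
    # forecast; in violation-free periods, decay slowly back toward 1.
    if violation:
        return jump
    return max(1.0, multiplier * decay)

# Simulate a week with a single violation on day 3.
violations = [False, False, True, False, False, False]
m = 1.0
path = []
for v in violations:
    m = next_multiplier(v, m)
    path.append(m)
```

The asymmetry (a discrete jump up, a geometric drift down) is what keeps violations within the penalty limits while letting capital charges fall again once the market calms.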
Orbital Characteristics of the Subdwarf-B and F V Star Binary EC 20117-4014 (= V4640 Sgr)
Among the competing evolution theories for subdwarf-B (sdB) stars is the
binary evolution scenario. EC 20117-4014 (= V4640 Sgr) is a spectroscopic binary
system consisting of a pulsating sdB star and a late F main-sequence companion
(O'Donoghue et al. 1997); however, the period and the orbital semi-major axes have
not been precisely determined. This paper presents orbital characteristics of
the EC 20117-4014 binary system using 20 years of photometric data. Periodic
Observed minus Calculated (O-C) variations were detected in the two highest
amplitude pulsations identified in the EC 20117-4014 power spectrum, indicating
the binary system's precise orbital period (P = 792.3 days) and the
light-travel time amplitude (A = 468.9 s). This binary shows no significant
orbital eccentricity, and the upper limit of the eccentricity is 0.025 (using 3σ
as an upper limit). This upper limit of the eccentricity is the lowest
among all wide sdB binaries with known orbital parameters. This analysis
indicated that the sdB is likely to have lost its hydrogen envelope through
stable Roche lobe overflow, thus supporting hypotheses for the origin of sdB
stars. In addition to those results, the underlying pulsation period change
obtained from the photometric data was dP/dt = 5.4 (0.7) d d⁻¹, which shows
that the sdB is just before the end of the core helium-burning phase.
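Using only the two orbital quantities quoted in the abstract, the light-travel-time picture can be sketched numerically: the O-C delay is a sinusoid at the orbital period (a circular orbit is assumed, consistent with the near-zero eccentricity above), and the amplitude fixes the projected semi-major axis of the sdB's orbit via a₁ sin i = c·A.

```python
import math

P_ORB_D = 792.3   # orbital period in days (from the abstract)
A_LTT_S = 468.9   # light-travel-time amplitude in seconds (from the abstract)

def o_minus_c(t_days, phase0=0.0):
    # Predicted O-C timing delay (s) of the pulsations for a circular
    # orbit: a pure sinusoid at the orbital period.
    return A_LTT_S * math.sin(2.0 * math.pi * t_days / P_ORB_D + phase0)

# Projected semi-major axis of the sdB's orbit, a1*sin(i) = c * A,
# converted to astronomical units.
C_KM_S = 299792.458
AU_KM = 1.495978707e8
a1_sini_au = C_KM_S * A_LTT_S / AU_KM
```

The delay peaks at A = 468.9 s a quarter-period after the node, and the implied a₁ sin i is roughly 0.94 AU, a characteristic scale for a wide sdB + main-sequence binary.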
Dynamics of Surface Roughening with Quenched Disorder
We study the dynamical exponent for the directed percolation depinning
(DPD) class of models for surface roughening in the presence of quenched
disorder. We argue that z for d+1 dimensions is equal to the exponent d_min
characterizing the shortest path between two sites in an
isotropic percolation cluster in d+1 dimensions. To test the argument, we
perform simulations and calculate z for DPD, and d_min for
percolation, from d = 1 to d = 6.
Comment: RevTeX manuscript, 3 pages + 6 figures (obtained upon request via email [email protected])
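The quantity behind d_min is the "chemical" (shortest-path) distance between two sites of a percolation cluster; d_min is the exponent with which that distance scales against the Euclidean separation at criticality. A minimal sketch of measuring the chemical distance on a 2D site-percolation lattice (lattice size, seed, and occupation probability are illustrative choices, not the paper's simulation setup):

```python
import random
from collections import deque

def percolation_lattice(L, p, seed=7):
    # Site percolation on an L x L square lattice: each site is
    # occupied independently with probability p.
    rng = random.Random(seed)
    return [[rng.random() < p for _ in range(L)] for _ in range(L)]

def chemical_distance(occ, src, dst):
    # BFS shortest path through occupied nearest-neighbour sites.
    L = len(occ)
    if not (occ[src[0]][src[1]] and occ[dst[0]][dst[1]]):
        return None
    dist = {src: 0}
    queue = deque([src])
    while queue:
        x, y = queue.popleft()
        if (x, y) == dst:
            return dist[(x, y)]
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if 0 <= nx < L and 0 <= ny < L and occ[nx][ny] \
                    and (nx, ny) not in dist:
                dist[(nx, ny)] = dist[(x, y)] + 1
                queue.append((nx, ny))
    return None  # dst is not connected to src

full = percolation_lattice(8, 1.0)  # p = 1: every site occupied
d_full = chemical_distance(full, (0, 0), (3, 4))
```

On a fully occupied lattice the chemical distance reduces to the Manhattan distance; on a critical cluster it grows faster, and averaging it over many site pairs and cluster realizations is how d_min is estimated.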
Prediction of cattle density and location at the frontier of Brazil and Paraguay using remote sensing
In this paper, we explore the potential of remote sensing to map pasture areas and, in this way, establish models for predicting cattle density and location. First, an object-based (OB) classification was performed on Landsat 5 images for three different municipalities to provide a land-cover map. Second, on the basis of the Brazilian official livestock database, a statistical model was produced to predict the number of cattle as a function of the pasture area declared by the farmers. Finally, this model was applied to the pasture areas detected by remote sensing to predict cattle density. The coefficient of determination of the model was 0.63. The results indicate that the methodology used for estimating cattle density has the potential to be applied in regions where no information about farm location and cattle density exists.
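The statistical step, regressing cattle numbers on declared pasture area and reporting a coefficient of determination, can be sketched with an ordinary least-squares fit. The area/cattle pairs below are hypothetical, and OLS is assumed as the model form; the abstract reports only the R² of 0.63.

```python
def fit_ols(xs, ys):
    # Ordinary least squares y = a + b*x, plus the coefficient of
    # determination R^2 (the statistic the abstract reports as 0.63).
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    ss_res = sum((y - a - b * x) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    return a, b, 1.0 - ss_res / ss_tot

# Hypothetical (declared pasture area in ha, cattle head) pairs.
areas = [50.0, 120.0, 200.0, 310.0, 420.0]
cattle = [70.0, 160.0, 290.0, 430.0, 600.0]
a, b, r2 = fit_ols(areas, cattle)
```

Applying the fitted slope and intercept to pasture areas detected by remote sensing then yields the predicted cattle density for regions without farm-level records.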