
    An Essay on the Double Nature of the Probability

    Classical statistics and Bayesian statistics rest on the frequentist and subjective theories of probability, respectively. Von Mises and de Finetti, who authored those conceptualizations, provide interpretations of probability that appear incompatible. This discrepancy has raised ample debate, and the foundations of the probability calculus remain a difficult, open issue. Instead of developing a philosophical discussion, this research resorts to analytical and mathematical methods. First, we present two theorems that sustain the validity of both the frequentist and the subjective views of probability; second, we show how the two facets of probability turn out to be consistent within the present logical framework.

    Estimation of Solute Fluxes from Ungaged Headwater Catchments in the Catskill Park of New York State

    Predictions of flow and subsequent solute fluxes from ungaged basins have important implications for both water resources management and ecosystem monitoring studies. The Catskill region of New York State is one such place that requires both water resources management and ecosystem monitoring due to its strategic location as the main water-supplying region for New York City. This study examines the differences in chemical mass flux estimates made in ungaged basins using three different chemistry aggregation methods for solute concentrations determined from monthly grab samples. The efficacy of area ratios for predicting flow at the upstream location of a nested pair of stream gages based on flow at the downstream reference gage is also explored. The benefit of data set partitioning and development of separate prediction models for different flow regimes of the reference gage is analyzed, and a threshold of area ratio for use of such a method is established, with implications for use in ungaged basins. This work is focused on the Catskill region, but is likely to be applicable to other temperate, montane systems. Significant relationships were observed between upstream and downstream flow in all test watersheds. Furthermore, watershed area ratio was the most important basin parameter for estimating flow at the upstream location of a nested pair of stream gages. The area ratio alone explained 93% of the variance in the functional relation slopes that best fit the flow regressions. Data set partitioning was found to be beneficial only for nested pairs with area ratios greater than 0.1, as determined by analysis of the root mean square error of the different flow prediction models. Five of the fifteen test watershed pairs had a lower root mean square error using the partitioned relationships, and these pairs all had area ratios greater than 0.1. The difference between the three chemistry aggregation methods was found to be small on an annual basis (average difference of 7%) and to increase at shorter time steps, up to daily flux estimates (average difference of 26%). This finding indicates that simple flow estimation methods based on area ratios are justifiable, and perhaps preferred, for estimation of annual chemical mass fluxes, and that for such estimates of flux, the exact solute chemistry aggregation method matters little on an annual basis.
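
    The flow-scaling idea described above is simple enough to sketch. Below is a minimal, hypothetical Python illustration (not the study's code) of scaling a downstream reference-gage record by the watershed area ratio and aggregating an annual solute flux from monthly grab-sample concentrations; the function names, unit choices, and numbers are assumptions made for illustration.

```python
# Minimal sketch (not the study's code): estimate upstream flow from a
# downstream reference gage via the drainage-area ratio, then aggregate an
# annual solute flux from monthly grab-sample concentrations.
import numpy as np

def estimate_upstream_flow(q_downstream, area_upstream, area_downstream):
    """Scale downstream daily flows by the drainage-area ratio."""
    area_ratio = area_upstream / area_downstream
    return area_ratio * q_downstream  # same units as q_downstream

def annual_flux_kg(daily_flow_m3s, monthly_conc_mgL):
    """Pair each day with its month's grab-sample concentration and integrate.

    daily_flow_m3s  : array of 365 daily mean flows (m^3/s)
    monthly_conc_mgL: array of 12 grab-sample concentrations (mg/L)
    """
    days_in_month = np.array([31, 28, 31, 30, 31, 30, 31, 31, 30, 31, 30, 31])
    daily_conc = np.repeat(monthly_conc_mgL, days_in_month)  # mg/L == g/m^3
    q_m3_per_day = daily_flow_m3s * 86400.0                  # m^3/s -> m^3/day
    daily_flux_kg = q_m3_per_day * daily_conc * 1e-3         # g/day -> kg/day
    return daily_flux_kg.sum()
```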

    Rating and Pricing: State of the art for the proposal of new methodologies

    Paraphrasing a sentence from the Nobel laureate Robert Merton: without risk we would have no need of finance; economics would be sufficient to describe and manage financial transactions across agents and countries. The ability to transfer risk expresses the main aim of finance, and a correct evaluation of counterparty risk is one of the main goals of credit risk theory. Financial crises underline this necessity: if a financial contract such as a Credit Default Swap, which by definition is a tool to protect against the default risk of a counterparty, is traded between banks and insurance companies that cannot exactly evaluate the financial position of their customers, the effects on global equilibrium and the loss of value can be severe. The question of pricing methodologies is strictly linked to the previous topic: is rating sufficient to capture all price fluctuations? What are the advantages and disadvantages of today's pricing methodologies? In which directions can they be modified or improved? Can mathematics give us an answer? How can the information underlying market risk be incorporated into a mathematical pricing model without compromising a correct credit risk evaluation? These are the questions this article will answer.

    Neuromedin U pathway in the control of obesity and other hypothalamus-regulated phenotypes

    This is a candidate pathway study in European children and adults. Neuromedin U (NMU) is a hypothalamic neuropeptide that regulates metabolic phenotypes. Our preliminary analyses in European children suggested that the NMU gene plays an important role in adiposity and bone health. This project aims to investigate the associations between hypothalamus-regulated phenotypes and NMU pathway genes by: (i) investigating in children, and confirming in adults, possible associations between polymorphisms in NMU pathway genes and adiposity, insulin resistance, blood pressure and bone health; (ii) verifying gene-gene interaction effects; and (iii) identifying specific rare loci or regions with aberrant methylation in the genes confirmed for associations. Two populations, one of children and one of adults, will be used. A two-step approach will be set up to identify and replicate associations, applying false discovery rate correction. The results will be useful for identifying potential targets for novel drugs and for recognizing subjects at high risk of metabolic and bone diseases.
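
    The replication design above mentions false discovery rate correction. A minimal sketch of the Benjamini-Hochberg procedure, one standard way to apply such a correction (this is not the project's analysis pipeline, and the p-values below are invented), is:

```python
# Benjamini-Hochberg FDR correction for a set of association p-values;
# the example p-values are invented for illustration only.
import numpy as np

def benjamini_hochberg(p_values, alpha=0.05):
    """Return a boolean mask of discoveries at FDR level alpha."""
    p = np.asarray(p_values)
    m = p.size
    order = np.argsort(p)
    ranked = p[order]
    # Find the largest k with p_(k) <= (k/m) * alpha; reject hypotheses 1..k.
    below = ranked <= (np.arange(1, m + 1) / m) * alpha
    reject = np.zeros(m, dtype=bool)
    if below.any():
        k = np.max(np.nonzero(below)[0])
        reject[order[:k + 1]] = True
    return reject

# Hypothetical p-values from SNP-phenotype association tests
p_vals = [0.001, 0.008, 0.039, 0.041, 0.20, 0.74]
print(benjamini_hochberg(p_vals))  # only the smallest p-values survive
```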

    Does corporate control matter to financial volatility?

    In our contribution we study how the ownership channel affects the stock price volatility of listed firms. In particular, we study how a linkage between a parent company and its affiliates may drive differences in stock price volatility, within and across countries. We exploit a worldwide dataset of stock-exchange listed firms, controlling for several financial dimensions, to assess whether business groups matter to financial volatility. The answer is positive and does not depend on the definition of volatility used. Our results contribute to the corporate finance literature by defining the role of multinational corporate control in financial markets, and to the financial stability literature by identifying corporate control as a previously unexplored channel of transmission for financial shocks.
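
    As a purely illustrative sketch of the kind of test described, and not the paper's actual specification, one could compute a firm-level volatility measure and regress it on a business-group indicator plus financial controls; the synthetic data and every column name below are invented.

```python
# Hypothetical sketch: does group affiliation relate to realized volatility?
# Synthetic firm-level data; column names are invented for illustration.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

def annualized_volatility(daily_returns):
    """One possible volatility definition: annualized std of daily log returns."""
    return np.std(np.log1p(daily_returns), ddof=1) * np.sqrt(252)

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "group_member": rng.integers(0, 2, n),       # 1 if affiliated with a group
    "size": rng.lognormal(4.0, 1.0, n),          # total assets (arbitrary units)
    "leverage": rng.uniform(0.0, 0.8, n),
    "country": rng.choice(["US", "DE", "JP"], n),
})
# Synthetic volatility with a small built-in group effect, for demonstration only
df["volatility"] = np.exp(
    -1.0 + 0.1 * df["group_member"] - 0.05 * np.log(df["size"])
    + 0.3 * df["leverage"] + rng.normal(0, 0.2, n)
)

model = smf.ols(
    "np.log(volatility) ~ group_member + np.log(size) + leverage + C(country)",
    data=df,
).fit(cov_type="HC1")  # heteroskedasticity-robust standard errors
print(model.params)
```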

    MODELLI PER LA VALUTAZIONE DI TITOLI SOGGETTI A FALLIMENTO (Models for the Valuation of Securities Subject to Default)

    This work constitutes the author's master's thesis, completing her curriculum of studies in the analytic-probabilistic track. The aim of the thesis is to provide mathematical models for the valuation of fixed-interest securities that are subject to default. The spread between government bonds considered safe, which for the Eurozone are the German Bunds, and less safe ones expresses the idea that the higher interest should compensate for the risk of buying a security more exposed to insolvency. Rating agencies and large investors have a concrete interest in assessing how much this risk is worth, on the basis not only of statistical data but also of models for the evolution of those data. The methodology adopted was the analysis of the development of the mathematical theory in the two main approaches to the problem: structural models and reduced-form models. While the former rely on an endogenous description of the possibility of insolvency, which occurs when the value of the firm's assets falls to a certain minimum threshold, the latter treat the default event as completely exogenous and therefore unpredictable; the parameter known as the intensity of the default time plays a particularly important role. The study highlighted the inadequacy of the first class of models for the stated objective. The reason is twofold: on the one hand, they require a complete description of the issuer's cash flows, whereas, as is well known, credit institutions can keep part of their balance sheets hidden; on the other hand, these models do not account for the possibility of default due to external factors. Intensity-based models were therefore developed, exploiting stochastic differential equations of Cox-Ingersoll-Ross type, which allowed the evaluation of default probabilities as well as the determination of the fair price of call options exposed to the risk that the claim is not honoured.
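
    Since the thesis adopts intensity-based (reduced-form) models driven by a Cox-Ingersoll-Ross process, a minimal illustrative Monte Carlo sketch, with invented parameter values and not taken from the thesis, is to simulate the CIR default intensity, estimate the survival probability E[exp(-∫λ dt)], and price a defaultable zero-coupon bond with zero recovery under a flat, independent risk-free rate:

```python
# Illustrative reduced-form model sketch: CIR default intensity
#   d(lambda) = kappa*(theta - lambda) dt + sigma*sqrt(lambda) dW,
# survival probability P(tau > T) = E[exp(-integral of lambda)].
# All parameter values are invented.
import numpy as np

def cir_survival_probability(lam0=0.02, kappa=0.5, theta=0.03, sigma=0.1,
                             T=5.0, n_steps=500, n_paths=20_000, seed=0):
    rng = np.random.default_rng(seed)
    dt = T / n_steps
    lam = np.full(n_paths, lam0)
    integral = np.zeros(n_paths)
    for _ in range(n_steps):
        dw = rng.normal(0.0, np.sqrt(dt), n_paths)
        # Full-truncation Euler scheme keeps the diffusion term well defined
        lam = lam + kappa * (theta - np.maximum(lam, 0.0)) * dt \
              + sigma * np.sqrt(np.maximum(lam, 0.0)) * dw
        integral += np.maximum(lam, 0.0) * dt
    return np.exp(-integral).mean()

surv = cir_survival_probability()
r = 0.02  # flat risk-free rate, assumed independent of default
price = np.exp(-r * 5.0) * surv  # defaultable zero-coupon bond, zero recovery
print(f"5y survival probability ~ {surv:.4f}, bond price ~ {price:.4f}")
```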

    New constraints for heavy axion-like particles from supernovae

    We derive new constraints on the coupling of heavy pseudoscalar (axion-like) particles to photons, based on the gamma-ray flux expected from the decay of these particles into photons. After being produced in the supernova core, these heavy axion-like particles would escape, and a fraction of them would decay into photons before reaching the Earth. We have calculated the expected flux of these photons at Earth from the supernovae SN 1987A and Cassiopeia A and compared our results to data from the Fermi Large Area Telescope. This analysis provides strong constraints on the parameter space for axion-like particles. For a particle mass of 100 MeV, we find that the Peccei-Quinn constant, f_a, must be greater than about 10^{15} GeV. Alternatively, for f_a = 10^{12} GeV, we exclude the mass region between approximately 100 eV and 1 GeV.
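
    The mechanism can be illustrated with a back-of-the-envelope estimate. The sketch below, which is not the paper's analysis, assumes the common model-dependent relation g ≈ α/(2π f_a) for the photon coupling and the standard two-photon decay width Γ = g² m_a³ / (64π), together with invented values for the ALP energy and source distance, and computes the fraction of particles decaying between the supernova and Earth:

```python
# Back-of-the-envelope sketch of the decay-in-flight mechanism (not the
# paper's analysis); couplings, energies, and distances are assumptions.
import numpy as np

ALPHA = 1 / 137.036          # fine-structure constant
HBARC_GEV_M = 1.973e-16      # hbar*c in GeV*m
KPC_M = 3.086e19             # 1 kpc in metres

def decay_fraction(m_a_gev, f_a_gev, energy_gev, distance_kpc):
    g = ALPHA / (2 * np.pi * f_a_gev)               # GeV^-1 (assumed relation)
    width = g**2 * m_a_gev**3 / (64 * np.pi)        # a -> 2 photons width, GeV
    boost = np.sqrt(energy_gev**2 - m_a_gev**2) / m_a_gev   # gamma*beta
    decay_length_m = boost * HBARC_GEV_M / width             # lab-frame length
    return 1.0 - np.exp(-distance_kpc * KPC_M / decay_length_m)

# Hypothetical numbers: 100 MeV ALP with 200 MeV energy, source at ~50 kpc
print(decay_fraction(m_a_gev=0.1, f_a_gev=1e15, energy_gev=0.2, distance_kpc=50))
```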

    TEACHERS’ PERCEPTIONS OF TECHNOLOGY EFFECTIVENESS IN HIGH SCHOOL

    The purpose of this study was to examine high school teachers’ perceptions of technology in the classroom, including technology access, usage, and effectiveness. The study was conducted by administering a survey to high school teachers, Grades 9 through 12, in Indiana. The survey, entitled Teachers’ Perceptions of Technology Effectiveness in High Schools, was developed to quantitatively measure teachers’ perceptions of technology access, usage, and effectiveness in classrooms and current technology usage patterns in the state of Indiana. A total of 343 teachers submitted complete responses. Data were analyzed using Pearson correlations, one-way ANOVA, and multiple regression. The analysis showed a significant correlation between teachers’ software and equipment utilization and perceived effectiveness. Significant differences were also noted in teachers’ perceptions and usage of technology based on age, and significant differences were found in perceptions and usage based on teaching position. Based on these results, the following conclusion was proposed: an effective professional development or training program should be implemented for teachers when technology is introduced. School corporations need to offer a comprehensive program over a period of time so that teachers can acclimate themselves to the various capabilities of the technology. Within this comprehensive program, there should also be time for ongoing professional development, time to collaborate with peers, administrative support, reflection and goal setting, and additional summer opportunities for further learning.
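
    As a purely illustrative sketch, and not the study's code, the three analyses named above could be run on survey data as follows; the data frame, column names, and values are invented:

```python
# Illustrative sketch of the three analyses on synthetic survey data;
# column names and values are invented for illustration only.
import numpy as np
import pandas as pd
from scipy import stats
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 343
df = pd.DataFrame({
    "usage": rng.normal(3.5, 0.8, n),                    # self-reported usage (1-5)
    "age_group": rng.choice(["20-35", "36-50", "51+"], n),
    "position": rng.choice(["core", "elective", "CTE"], n),
})
df["effectiveness"] = 0.5 * df["usage"] + rng.normal(2.0, 0.6, n)  # synthetic link

# 1) Pearson correlation: usage vs perceived effectiveness
r, p = stats.pearsonr(df["usage"], df["effectiveness"])

# 2) One-way ANOVA: perceived effectiveness across age groups
groups = [g["effectiveness"].values for _, g in df.groupby("age_group")]
f_stat, p_anova = stats.f_oneway(*groups)

# 3) Multiple regression: effectiveness on usage plus categorical predictors
model = smf.ols("effectiveness ~ usage + C(age_group) + C(position)", data=df).fit()

print(f"Pearson r={r:.2f} (p={p:.3g}); ANOVA F={f_stat:.2f} (p={p_anova:.3g})")
print(model.params)
```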