370 research outputs found

    Essays in Quantitative Risk Management for Financial Regulation of Operational Risk Models

    Get PDF
    An extensive and evolving body of guidance and rules is provided to banks by financial regulators. A particular set of instructions outlines requirements to calculate and set aside loss-absorbing regulatory capital to ensure the solvency of a bank. Banks typically use mathematical models to quantify sufficient amounts of capital. In this thesis, we explore areas that advance our knowledge in regulatory risk management. In the first essay, we explore an aspect of operational risk loss modeling using scenario analysis. An actuarial modeling method is typically used to quantify a baseline capital value, which is then layered with a judgemental component in order to account for and integrate what-if future potential losses into the model. We propose a method from digital signal processing that uses the convolution operator to view the problem as the blending of two signals: a baseline loss distribution, obtained from modeling the frequency and severity of internal losses, is combined with a probability distribution obtained from scenario responses to yield a final output that integrates both sets of information. In the second essay, we revisit scenario analysis and the potential impact of catastrophic events at the enterprise level of a bank. We generalize an algorithm to account for multiple levels of event intensity together with unique loss profiles that depend on the business units affected. In the third essay, we investigate the problem of allocating aggregate capital across sub-portfolios in a fair manner when there are various forms of interdependencies. Relevant to the areas of market, credit and operational risk, the multivariate shortfall allocation problem quantifies the optimal amount of capital needed to ensure that the expected loss under a convex loss penalty function remains bounded by a threshold. We first provide an application of the existing methodology to a subset of high-frequency loss cells. Lastly, we provide an extension using copula models that allows for the modeling of joint fat-tailed events or asymmetries in the underlying process.
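
    As a rough illustration of the convolution-based blending idea described above, the following Python sketch discretizes a hypothetical baseline loss distribution and a hypothetical scenario distribution on a common grid, convolves the two probability mass functions, and reads a capital figure off the result as a high quantile. The distributions, grid, and quantile level are all assumptions made for illustration; this is not the thesis's implementation.

```python
import numpy as np

# Minimal sketch (not the thesis's implementation): blend a hypothetical
# baseline annual-loss distribution with a hypothetical scenario-derived
# distribution by discrete convolution, then read a capital figure off the
# blended distribution as a high quantile.

rng = np.random.default_rng(42)
grid_step = 1.0
bins = np.arange(0.0, 2000.0 + grid_step, grid_step)

# Illustrative samples standing in for the two model components.
baseline = rng.gamma(shape=2.0, scale=50.0, size=100_000)      # internal-loss model
scenario = rng.lognormal(mean=4.0, sigma=1.0, size=100_000)    # scenario responses

# Discretize both onto a common grid and normalize to probability masses.
p_base, _ = np.histogram(baseline, bins=bins)
p_scen, _ = np.histogram(scenario, bins=bins)
p_base = p_base / p_base.sum()
p_scen = p_scen / p_scen.sum()

# Convolution of the two probability mass functions: the distribution of the
# combined loss when the two components are treated as independent.
p_blend = np.convolve(p_base, p_scen)
p_blend = p_blend / p_blend.sum()

# Capital read off as a high quantile (99.9% here, purely for illustration).
cdf = np.cumsum(p_blend)
var_999 = np.searchsorted(cdf, 0.999) * grid_step
print(f"99.9% quantile of the blended loss distribution: {var_999:.1f}")
```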

    Analysis of remote sensing images captured with a liquid lens using super-resolution methods

    Get PDF
    The electronic version of this dissertation does not include the publications. In this thesis, both hardware and software solutions for image enhancement were studied. On the hardware side, a new liquid lens design with a dielectric elastomer stack actuator (DESA) membrane located directly in the optical path was demonstrated. Two prototypes with two different DESAs, with active areas of 40 and 20 mm in diameter, were developed. The lens performance was consistent with the mechanics of elastomer deformation and the relative changes in focal length. A laser beam was used to show the change in the meniscus and to measure the focal length of the lens. The experimental results demonstrate that a voltage in the range of 50 to 750 V is required to create a change in the meniscus. On the software side, a new satellite image enhancement system was proposed. The proposed technique decomposed the noisy input image into various frequency subbands by using DT-CWT.
    After removing the noise by applying the LA-BSF technique, the image's resolution was enhanced by employing the DWT and interpolating the high-frequency subband images. The original image was interpolated with half of the interpolation factor used for the high-frequency subband images, and the super-resolved image was reconstructed by using the IDWT. A novel single-image SR method was also proposed, based on generating a dictionary from pairs of HR images and their corresponding LR images. First, the HR and LR pairs were divided into patches in order to build HR and LR dictionaries, respectively. The initial HR representation of an input LR image was calculated by combining HR patches; these patches are chosen from the HR dictionary so that their corresponding LR patches have the closest distance to the patches of the input LR image. Each selected HR patch was then passed through an illumination enhancement step in order to reduce the noticeable change of illumination between neighboring patches in the super-resolved image. To reduce the blocking effect, the average of the obtained SR image and the bicubic-interpolated image was calculated. New kernels for sampling were also proposed; these kernels can sharpen the resulting image and thereby improve the SR output. To demonstrate the effectiveness of the proposed kernels, the resolution-enhancement techniques from [83] and [50] were adopted. The super-resolved image was obtained by combining the HR images produced by each of the proposed kernels using the alpha blending technique. The proposed techniques and kernels were compared with various conventional and state-of-the-art techniques; the quantitative test results and visual assessment of the final image quality show the superiority of the proposed techniques and kernels over conventional and state-of-the-art techniques.
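
    To make the DWT-based interpolation step more concrete, here is a minimal Python sketch of that idea. It is illustrative only: the wavelet choice, interpolation factor, and cropping are assumptions, and it uses a plain DWT stand-in rather than the thesis's full DT-CWT/LA-BSF pipeline.

```python
import numpy as np
import pywt
from scipy.ndimage import zoom

def wavelet_upscale(image: np.ndarray, factor: int = 2, wavelet: str = "db4") -> np.ndarray:
    """Sketch of DWT-based resolution enhancement (illustrative only).

    The image is decomposed into subbands; the high-frequency subbands are
    interpolated by `factor`, the original image (interpolated by factor/2)
    stands in for the low-frequency subband, and the result is reconstructed
    with the inverse DWT.
    """
    _, (lh, hl, hh) = pywt.dwt2(image, wavelet)

    lh_up = zoom(lh, factor, order=3)
    hl_up = zoom(hl, factor, order=3)
    hh_up = zoom(hh, factor, order=3)

    # Original image interpolated with half the factor used for the
    # high-frequency subbands, as described in the abstract.
    ll_up = zoom(image, factor / 2, order=3)

    # Shapes must match for the inverse transform; crop to the smallest.
    h = min(ll_up.shape[0], lh_up.shape[0])
    w = min(ll_up.shape[1], lh_up.shape[1])
    coeffs = (ll_up[:h, :w], (lh_up[:h, :w], hl_up[:h, :w], hh_up[:h, :w]))
    return pywt.idwt2(coeffs, wavelet)

# Example: upscale a synthetic 64x64 image to roughly twice its size.
img = np.random.rand(64, 64)
sr = wavelet_upscale(img, factor=2)
print(img.shape, "->", sr.shape)
```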

    Approximate Inference for Constructing Astronomical Catalogs from Images

    Full text link
    We present a new, fully generative model for constructing astronomical catalogs from optical telescope image sets. Each pixel intensity is treated as a random variable with parameters that depend on the latent properties of stars and galaxies. These latent properties are themselves modeled as random. We compare two procedures for posterior inference: one based on Markov chain Monte Carlo (MCMC) and the other based on variational inference (VI). The MCMC procedure excels at quantifying uncertainty, while the VI procedure is 1000 times faster. On a supercomputer, the VI procedure efficiently uses 665,000 CPU cores to construct an astronomical catalog from 50 terabytes of images in 14.6 minutes, demonstrating the scaling characteristics necessary to construct catalogs for upcoming astronomical surveys. Comment: accepted to the Annals of Applied Statistics.
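
    To illustrate the kind of pixel-level generative model described (pixel intensities as random variables whose rates depend on the latent properties of sources), here is a minimal Python sketch. The Poisson likelihood with a Gaussian PSF and the parameter names are simplifying assumptions for illustration, not the paper's exact model.

```python
import numpy as np

# Illustrative sketch of a pixel-level generative model for a small image
# patch: each pixel count is Poisson with a rate given by a sky background
# plus the PSF-spread flux of each latent source. The Gaussian PSF and the
# parameter names are assumptions, not the paper's model.

rng = np.random.default_rng(0)
H, W = 32, 32
background = 50.0          # sky background rate per pixel
psf_sigma = 1.5            # Gaussian PSF width in pixels

# Latent source properties: position and total flux.
sources = [
    {"x": 10.3, "y": 12.7, "flux": 4000.0},
    {"x": 22.1, "y": 20.4, "flux": 1500.0},
]

yy, xx = np.mgrid[0:H, 0:W]
rate = np.full((H, W), background)
for s in sources:
    psf = np.exp(-((xx - s["x"])**2 + (yy - s["y"])**2) / (2 * psf_sigma**2))
    psf /= psf.sum()                     # normalize so the PSF distributes the flux
    rate += s["flux"] * psf

# Pixel intensities as random variables conditioned on the latent sources.
image = rng.poisson(rate)

# The Poisson log-likelihood (up to an additive constant) that MCMC or VI
# would target when inferring the latent source properties.
log_like = np.sum(image * np.log(rate) - rate)
print("simulated patch shape:", image.shape, " log-likelihood:", round(log_like, 1))
```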

    Glosarium Matematika

    Get PDF
    273 p.; 24 cm

    Annales Mathematicae et Informaticae 2015

    Get PDF

    Annales Mathematicae et Informaticae (45.)

    Get PDF

    Annales Mathematicae et Informaticae (36.)

    Get PDF

    Glosarium Matematika

    Get PDF

    Galaxy And Mass Assembly (GAMA): end of survey report and data release 2

    Get PDF
    The Galaxy And Mass Assembly (GAMA) survey is one of the largest contemporary spectroscopic surveys of low-redshift galaxies. Covering an area of ~286 deg² (split among five survey regions) down to a limiting magnitude of r < 19.8 mag, we have collected spectra and reliable redshifts for 238 000 objects using the AAOmega spectrograph on the Anglo-Australian Telescope. In addition, we have assembled imaging data from a number of independent surveys in order to generate photometry spanning the wavelength range 1 nm-1 m. Here, we report on the recently completed spectroscopic survey and present a series of diagnostics to assess its final state and the quality of the redshift data. We also describe a number of survey aspects and procedures, or updates thereof, including changes to the input catalogue, redshifting and re-redshifting, and the derivation of ultraviolet, optical and near-infrared photometry. Finally, we present the second public release of GAMA data. In this release, we provide input catalogue and targeting information, spectra, redshifts, ultraviolet, optical and near-infrared photometry, single-component Sérsic fits, stellar masses, Hα-derived star formation rates, environment information, and group properties for all galaxies with r < 19.0 mag in two of our survey regions, and for all galaxies with r < 19.4 mag in a third region (72 225 objects in total). The database serving these data is available at http://www.gama-survey.org/