
    The wavelength dependence of polarization in NGC 2023

    NGC 2023 is a bright reflection nebula illuminated by the central star HD 37903. At visual wavelengths the nebula is seen solely by light reflected from the central star, but in the near infrared (around 2 microns) there is excess radiation that is thought to arise from thermal emission by a population of small grains (Sellgren, 1984). The unexpectedly high surface brightness at R and I wavelengths has led to the suggestion that even at these wavelengths there is a significant contribution from this thermal emission process (Witt, Schild, and Kraiman, 1984). If the nebula is seen by reflected starlight, this radiation will be linearly polarized. The level of polarization depends on the scattering geometry, the grain size distribution, etc., and is typically 20 to 40 percent for nebulae such as NGC 1999, which is morphologically similar to NGC 2023. If, in any waveband, there is a contribution from emission processes, this radiation will be unpolarized and will dilute the scattered radiation, lowering the observed polarization. A study of the wavelength dependence of polarization in nebulae in which there may be thermal emission from grains will therefore indicate the contribution of this process to the total luminosity. Polarization maps were produced in the BVRI wavebands for the NGC 2023 nebulosity, confirming that at all wavelengths it is a reflection nebula illuminated by a central star. The wavelength dependence of polarization is presented at representative points in the nebula, and as a scatter plot of polarization in the V and I wavebands at all points at which measurements were made. The results indicate that throughout the nebula there is a general trend for the level of polarization to increase with wavelength, with the maximum polarization occurring at the longest wavelengths. No evidence is seen in the data for any significant contribution from thermal emission by grains to the BVRI luminosity of NGC 2023.
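    As a hedged illustration of the dilution argument above (not a formula quoted from the paper), let p_scat denote the intrinsic polarization of the purely scattered component and f_em the fraction of the light in a given waveband contributed by unpolarized emission. The observed polarization is then

        p_{\mathrm{obs}} = (1 - f_{\mathrm{em}})\, p_{\mathrm{scat}}
        \quad\Longrightarrow\quad
        f_{\mathrm{em}} = 1 - \frac{p_{\mathrm{obs}}}{p_{\mathrm{scat}}},

    so a waveband in which the polarization falls well below that expected for pure scattering would indicate a significant emission contribution in that band.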

    A comparison of Spillover Effects before, during and after the 2008 Financial Crisis

    This paper applies graphical modelling to the S&P 500, Nikkei 225 and FTSE 100 stock market indices to trace the spillover of returns and volatility between these three major world stock market indices before, during and after the 2008 financial crisis. We find that the depth of market integration changed significantly between the pre-crisis period and the crisis and post-crisis periods. Graphical models of both return and volatility spillovers are presented for each period. We conclude that graphical models are a useful tool in the analysis of multivariate time series where tracing the flow of causality is important.
    Keywords: volatility spillover; graphical modelling; financial crisis; causality
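    The abstract does not specify the graphical-modelling procedure used; as one hedged sketch of the general idea, a Gaussian graphical model can be read off from the partial correlations of the return series, with an edge drawn wherever the partial correlation is appreciable. The data, threshold and function names below are illustrative assumptions.

        # Hedged sketch only: a Gaussian graphical model of return linkages via
        # partial correlations, on simulated stand-in data for the three indices.
        import numpy as np
        import pandas as pd

        def partial_correlations(returns: pd.DataFrame) -> pd.DataFrame:
            """Partial correlations from the inverse covariance (precision) matrix."""
            prec = np.linalg.inv(returns.cov().values)
            d = np.sqrt(np.diag(prec))
            pcorr = -prec / np.outer(d, d)
            np.fill_diagonal(pcorr, 1.0)
            return pd.DataFrame(pcorr, index=returns.columns, columns=returns.columns)

        # Illustrative daily returns; in practice these would be the index returns
        # for each sub-period (pre-crisis, crisis, post-crisis).
        rng = np.random.default_rng(0)
        rets = pd.DataFrame(rng.normal(size=(1000, 3)),
                            columns=["SP500", "Nikkei225", "FTSE100"])

        pc = partial_correlations(rets)
        edges = [(a, b) for i, a in enumerate(pc.columns)
                 for b in pc.columns[i + 1:] if abs(pc.loc[a, b]) > 0.1]
        print(pc.round(2))
        print("edges:", edges)

    Fitting one such graph per sub-period, and an analogous one on squared or absolute returns for volatility, mirrors the kind of period-by-period comparison described above.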

    Experiments with a novel CCD stellar polarimeter

    Experiments and observations have been undertaken with "bread-board" equipment to explore the potential of a "ring" stellar polarimeter using a CCD camera rather than the photographic plates used in Treanor's (1968) original instrument. By spreading the polarimetric signal over a large number of pixels on the detector, design predictions suggest that the polarimetric accuracy could be ~Δp = ±0.00001, or ±0.001%, per frame or even better. Although the photon accumulations suggest that this was achieved, instabilities in the crude modulator system employed gave frame-to-frame measurements with a greater than expected scatter. Software was developed to reduce the data in a simple way. With a design using more professional components, and perhaps more sophisticated reduction procedures, the full potential of the method should be achievable, with the prospect of high-precision polarimetry of the brighter stars. As an experimental bonus, the CCD chip employed was found to be free from any measurable polarization sensitivity.
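    As a hedged back-of-envelope check of that design figure (not the instrument's actual error budget), the photon shot-noise floor on a single normalized Stokes parameter is roughly Δp ≈ sqrt(2/N) for N detected photons, assuming ideal modulation efficiency, so the photon count needed for a target precision can be estimated as:

        # Hedged estimate only: photon-noise-limited polarimetric precision,
        # assuming ideal modulation efficiency (delta_p ~ sqrt(2 / N)).
        def photons_for_precision(delta_p: float) -> float:
            """Detected photons needed to reach a shot-noise floor of delta_p."""
            return 2.0 / delta_p ** 2

        print(f"{photons_for_precision(1e-5):.1e}")  # ~2e10 photons for delta_p = 1e-5

    Spreading photon counts of this order over many pixels per frame, as described above, is what makes such a precision plausible without saturating individual pixels.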

    Extreme Value GARCH modelling with Bayesian Inference

    Extreme value theory is widely used in financial applications such as risk analysis, forecasting and pricing models. One of the major difficulties in applications to finance and economics is that the assumption of independence of the time series observations is generally not satisfied, so that dependent extremes may not necessarily be in the domain of attraction of the classical generalised extreme value distribution. This study examines a conditional extreme value distribution with the added specification that the extreme values (maxima or minima) follow a conditional autoregressive heteroscedasticity process. The dependence is modelled by allowing the location and scale parameters of the extreme value distribution to vary with time. The resulting combined model, GEV-GARCH, is developed by implementing the GARCH volatility mechanism in these extreme value model parameters. Bayesian inference is used for parameter estimation, and posterior inference is available through the Markov chain Monte Carlo (MCMC) method. The model is first applied to simulated data to verify model stability and the reliability of the parameter estimation method. Real stock returns are then used to assess the appropriate application of the model. A comparison is made between the GEV-GARCH and traditional GARCH models. Both models give similar conditional volatility estimates; however, the GEV-GARCH model differs from GARCH in that it can capture and explain extreme quantiles better, because of more reliable extrapolation of the tail behaviour.
    Keywords: Extreme value distribution, dependency, Bayesian, MCMC, return quantile
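    The exact recursions are not given in this abstract; the sketch below is a hedged illustration of the general GEV-GARCH idea, with a GARCH(1,1)-type equation for the scale and an assumed link mu_t = mu0 + lam * sigma_t for the location. The parameter names, the link function and the quick maximum-likelihood fit (standing in for the paper's MCMC) are all illustrative assumptions.

        # Hedged sketch of a GEV likelihood with GARCH-driven location and scale.
        import numpy as np
        from scipy.optimize import minimize
        from scipy.stats import genextreme

        def neg_loglik(params, x):
            xi, mu0, lam, a0, a1, b1 = params
            if a0 <= 0 or a1 < 0 or b1 < 0 or a1 + b1 >= 1:
                return np.inf                     # keep the scale recursion stationary
            n = len(x)
            sig2 = np.empty(n)
            mu = np.empty(n)
            sig2[0] = np.var(x)
            mu[0] = mu0 + lam * np.sqrt(sig2[0])
            for t in range(1, n):
                sig2[t] = a0 + a1 * (x[t - 1] - mu[t - 1]) ** 2 + b1 * sig2[t - 1]
                mu[t] = mu0 + lam * np.sqrt(sig2[t])
            # scipy's genextreme uses shape c = -xi relative to the usual GEV xi.
            return -genextreme.logpdf(x, c=-xi, loc=mu, scale=np.sqrt(sig2)).sum()

        # Usage sketch on simulated block maxima.
        x = genextreme.rvs(c=-0.1, loc=0.0, scale=1.0, size=500, random_state=0)
        fit = minimize(neg_loglik, x0=[0.1, 0.0, 0.1, 0.05, 0.1, 0.8],
                       args=(x,), method="Nelder-Mead")
        print(fit.x)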

    Bayesian Extreme Value Mixture Modelling for Estimating VaR

    A new extreme value mixture modelling approach for estimating Value-at-Risk (VaR) is proposed, overcoming the key issues of determining the threshold which defines the distribution tail and accounting for the uncertainty due to threshold choice. A two-stage approach is adopted: volatility estimation followed by conditional extremal modelling of the independent innovations. Bayesian inference is used to account for all uncertainties and enables the inclusion of expert prior information, potentially overcoming the inherent sparsity of extremal data. Simulations show the reliability and flexibility of the proposed mixture model, followed by VaR forecasting to capture returns during the current financial crisis.
    Keywords: extreme values; Bayesian; threshold estimation; Value-at-Risk
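    The mixture and prior details are not stated in this abstract; the sketch below shows only the two-stage structure it describes, with an EWMA filter standing in for the volatility model and a fixed-threshold maximum-likelihood generalised Pareto fit standing in for the Bayesian mixture (which treats the threshold as unknown). The function names, threshold quantile and simulated data are illustrative assumptions.

        # Hedged sketch of the two-stage structure: volatility filter, then a
        # tail model on the standardized innovations, combined into a VaR.
        import numpy as np
        from scipy.stats import genpareto

        def var_two_stage(returns, p=0.99, lam=0.94, threshold_q=0.90):
            # Stage 1: EWMA volatility and standardized innovations.
            sig2 = np.empty(len(returns))
            sig2[0] = np.var(returns)
            for t in range(1, len(returns)):
                sig2[t] = lam * sig2[t - 1] + (1 - lam) * returns[t - 1] ** 2
            z = returns / np.sqrt(sig2)

            # Stage 2: peaks-over-threshold fit to the standardized losses.
            losses = -z
            u = np.quantile(losses, threshold_q)
            excess = losses[losses > u] - u
            xi, _, beta = genpareto.fit(excess, floc=0.0)
            frac_exceed = len(excess) / len(losses)

            # GPD tail quantile of the loss distribution, scaled by next-day vol.
            z_p = u + (beta / xi) * (((1 - p) / frac_exceed) ** (-xi) - 1.0)
            sig_next = np.sqrt(lam * sig2[-1] + (1 - lam) * returns[-1] ** 2)
            return sig_next * z_p

        rets = np.random.default_rng(1).standard_t(df=5, size=2000) * 0.01
        print(f"one-day 99% VaR (loss): {var_two_stage(rets):.4f}")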

    The discovery of a highly polarized bipolar nebula

    During a search for the optical counterparts of IRAS sources whose flux peaks at 25 microns, a small faint bipolar nebula was discovered in Monoceros at the position of IRAS 07131-0147. The CCD images display the object's considerable structure. The central star seems relatively free of nearby nebulosity; the two lobes have a bow-tie structure, with those parts nearest to the star consisting of series of small knots. The outer parts of the lobes seem to be made up of filaments streaming away from the knots. On the basis of its optical spectrum, the central star was classified as an M5-6 giant. In the IRAS color classification scheme of Van der Veen and Habing (1988) the central star is of class VIb, which indicates that there are distinct hot and cold components of circumstellar dust and that the mass-loss process may have temporarily abated. Therefore, it is proposed that the object is in the post-main-sequence stage of evolution and is a protoplanetary nebula. Young protoplanetary nebulae have totally obscured central stars illuminating reflective lobes, whereas older ones such as M2-9 have lobes seen in emission from gas ionized by the central hot star, which is clearly visible. Since the central object of IRAS 07131-0147 is a relatively unobscured late-type star and the lobes are seen only by reflection, it is suggested that this nebula is a protoplanetary nebula in an evolutionary stage intermediate between that of CRL 2688 and M2-9.

    Optical Spectropolarimetry and Asphericity of Type Ic SN 2007gr

    We present optical spectropolarimetric observations of the Type Ic supernova (SN) 2007gr taken with the Subaru telescope 21 days after maximum brightness (~37 days after the explosion). Non-zero polarization as high as ~3% is observed at the absorption feature of the Ca II IR triplet. The polarization of the continuum light is ~0.5% if we estimate the interstellar polarization (ISP) component assuming that the continuum polarization has a single polarization angle. This suggests that the axis ratio of the SN photosphere projected on the sky differs from unity by ~10%. The polarization angle at the Ca II absorption is almost aligned with that of the continuum light. These features may be understood by a model in which a bipolar explosion with an oblate photosphere is viewed from a slightly off-axis direction and explosively synthesized Ca near the polar region obscures the light originating from around the minor axis of the SN photosphere. Given the uncertainty in the ISP, however, the polarization data could also be interpreted with a model having an almost spherically symmetric photosphere and a clumpy Ca II distribution.
    Comment: 9 pages, 8 figures; accepted for publication in the Astrophysical Journal
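    As a hedged reminder of the Stokes algebra underlying such numbers (not the authors' reduction pipeline), the degree and angle of polarization follow from the normalized Stokes parameters, and an assumed ISP component is removed by vector subtraction in the q-u plane; all values below are purely illustrative.

        # Hedged illustration: degree/angle of polarization from fractional
        # Stokes parameters, with a vectorial ISP subtraction in the q-u plane.
        import numpy as np

        def pol_degree_angle(q, u):
            """q = Q/I, u = U/I (e.g. in percent) -> (p, angle in degrees)."""
            p = np.hypot(q, u)
            theta = 0.5 * np.degrees(np.arctan2(u, q)) % 180.0
            return p, theta

        q_obs, u_obs = 0.8, 2.9       # illustrative observed values (percent)
        q_isp, u_isp = 0.3, -0.2      # illustrative assumed ISP component
        q_sn, u_sn = q_obs - q_isp, u_obs - u_isp
        print(pol_degree_angle(q_sn, u_sn))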

    Explorations of knowledge management in a defence engineering environment

    This thesis originates from the researcher's first-hand early experiences of current processes and practices in operation at BAE SYSTEMS Ltd (hereafter referred to as 'the Company'), and recognises the potential for improvement within the realm of knowledge management. The huge volume of internal and external information overwhelms the majority of organisations, and knowledge management provides solutions that enable organisations to be effective, efficient, and competitive. The software agent approach and information retrieval techniques show great potential for effectively managing information. This research seeks to answer the questions of whether software agents can provide the Company with solutions to the knowledge management issues identified in this inquiry and whether they can also be used elsewhere within the organisation to improve other aspects of the business. The research analysis shows that software agents offer wide applicability across the Company, can be created with relative ease, and can provide benefits by improving the effectiveness and efficiency of processes. The findings also provide valuable insight into the human-computer-interface design and usability aspects of software agent applications. The research addresses these questions using action research, in order to develop a collaborative change mechanism within the Company and a practical application of the research findings in situ. Using a pluralistic methodology, the findings combine subjective and objective views within the research cycles, giving the researcher a more holistic view of this research. Little attention has previously been paid to integrating software agent technologies into knowledge management processes. This research proposes a software agent application that: (1) co-ordinates software agents for information retrieval to manage information gathering, filtering, and dissemination; (2) promotes more effective interpretation of information and more efficient processes; (3) builds accurate search profiles weighted on pre-defined criteria; (4) integrates and organises a Company resource management knowledge base; (5) ensures that the right information gets to the right personnel at the right time; and (6) enables the Company to assign the right experts to the right roles within the Company.
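    The agent design itself is not detailed in this abstract; as a hedged illustration only, point (3) above could be realised by something as simple as a term-weighted search profile that scores and filters incoming documents before dissemination. The profile terms, weights and function names below are invented for the example.

        # Hedged illustration of a weighted search profile (not the thesis design).
        from typing import Dict, List, Tuple

        def score(document: str, profile: Dict[str, float]) -> float:
            """Sum the weights of profile terms found in the document text."""
            text = document.lower()
            return sum(w for term, w in profile.items() if term in text)

        def filter_for_user(docs: List[str], profile: Dict[str, float],
                            threshold: float = 1.0) -> List[Tuple[float, str]]:
            """Rank documents by profile score and keep those above the threshold."""
            ranked = sorted(((score(d, profile), d) for d in docs), reverse=True)
            return [(s, d) for s, d in ranked if s >= threshold]

        profile = {"avionics": 2.0, "fatigue testing": 1.5, "procurement": 0.5}
        docs = ["Fatigue testing results for the avionics bay", "Canteen menu update"]
        print(filter_for_user(docs, profile))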
