
    Bayesian modelling of ultra high-frequency financial data

    The availability of ultra high-frequency (UHF) data on transactions has revolutionised data processing and statistical modelling techniques in finance. The unique characteristics of such data, e.g. the discrete structure of price changes, unequally spaced time intervals and multiple transactions, have introduced new theoretical and computational challenges. In this study, we develop a Bayesian framework for modelling integer-valued variables to capture the fundamental properties of price change. We propose the application of the zero inflated Poisson difference (ZPD) distribution for modelling UHF data and assess the effect of covariates on the behaviour of price change. For this purpose, we present two modelling schemes: the first is based on the analysis of the data after the market closes for the day and is referred to as off-line data processing. In this case, the Bayesian interpretation and analysis are undertaken using Markov chain Monte Carlo methods. The second modelling scheme introduces the dynamic ZPD model, which is implemented through Sequential Monte Carlo methods (also known as particle filters). This procedure enables us to update our inference as new transactions take place and is known as online data processing. We apply our models to a set of FTSE100 index changes. Based on the probability integral transform, modified for the case of integer-valued random variables, we show that our models explain the observed distribution of price change well. We then apply the deviance information criterion and introduce its sequential version for the purpose of model comparison for off-line and online modelling, respectively. Moreover, in order to add more flexibility to the tails of the ZPD distribution, we introduce the zero inflated generalised Poisson difference distribution and outline its possible application for modelling UHF data.
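
    To make the modelling idea concrete, the sketch below evaluates a zero-inflated Poisson difference probability mass function, assuming the standard Skellam (Poisson difference) form with rates lam1 and lam2 and an extra zero-inflation weight pi placed on a zero price change; the parameter names and the scipy-based implementation are illustrative rather than the thesis's own code.

```python
# Minimal sketch of a zero-inflated Poisson difference (ZPD) pmf, assuming the
# standard Skellam form with rates lam1, lam2 and zero-inflation weight pi.
import numpy as np
from scipy.stats import skellam

def zpd_pmf(k, lam1, lam2, pi):
    """P(X = k) with extra probability mass pi placed on k = 0 (zero inflation)."""
    base = skellam.pmf(k, lam1, lam2)              # Poisson-difference component
    return np.where(k == 0, pi + (1 - pi) * base, (1 - pi) * base)

# Example: probabilities of a one-tick fall, no change and a one-tick rise
k = np.array([-1, 0, 1])
print(zpd_pmf(k, lam1=0.8, lam2=0.7, pi=0.3))
```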

    Modelling large-scale structures in the high-latitude ionosphere using 15 years of data from the EISCAT Svalbard Radar

    The ionosphere is a highly complex plasma containing electron density structures with a wide range of spatial scale sizes. Large-scale structures with horizontal extents of tens to hundreds of km exhibit variation with time of day, season, solar cycle, geomagnetic activity, solar wind conditions, and location. Whilst the processes driving these large-scale structures are well understood, the relative importance of these driving processes is a fundamental, unanswered question. The large-scale structures can also cause smaller-scale irregularities that arise due to instability processes such as the gradient drift instability (GDI) and turbulence. These smaller-scale structures can disrupt trans-ionospheric radio signals, including those used by Global Navigation Satellite Systems (GNSS). Statistical modelling techniques have been used to generate models of various measures of large-scale plasma structuring in the high-latitude ionosphere using 15 years of data gathered by the EISCAT Svalbard Radar. These models quantify the relative importance of the dominant driving processes in four time sectors (noon, dusk, midnight and dawn). In every sector the dominant process is the seasonal variation, which is attributed to both the variation in the chemical composition of the atmosphere and the maintenance of the background ionosphere by photoionization in summer. Secondary processes vary with time sector, but include variations with the solar cycle, geomagnetic activity, and the strength, orientation and variation of the Interplanetary Magnetic Field. Geophysical variables are used as proxies for these physical processes. As data for the geophysical variables selected are available in real time, these models have the potential to make real time predictions of the amount of plasma structuring in the ionosphere for GNSS applications.
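
    As an illustration of the kind of statistical modelling described, the sketch below regresses a hypothetical plasma-structuring index on an annual harmonic and common geophysical proxies (F10.7 for the solar cycle, Kp for geomagnetic activity, IMF Bz for the solar wind); the covariates, model form and synthetic data are assumptions for illustration, not the models fitted in the study.

```python
# Illustrative only: least-squares fit of a structuring index to a seasonal
# harmonic and geophysical proxies; covariates and data are synthetic.
import numpy as np

def design_matrix(day_of_year, f107, kp, bz):
    phase = 2 * np.pi * day_of_year / 365.25
    return np.column_stack([
        np.ones_like(f107),              # intercept
        np.sin(phase), np.cos(phase),    # annual (seasonal) harmonic
        f107,                            # solar-cycle proxy
        kp,                              # geomagnetic-activity proxy
        bz,                              # IMF orientation/strength proxy
    ])

rng = np.random.default_rng(0)
n = 500
doy = rng.uniform(0, 365, n)
f107 = rng.uniform(70, 200, n)
kp = rng.uniform(0, 9, n)
bz = rng.normal(0, 3, n)

X = design_matrix(doy, f107, kp, bz)
y = X @ np.array([1.0, 0.8, 0.2, 0.01, 0.05, -0.03]) + rng.normal(0, 0.1, n)

beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(beta)   # fitted coefficients indicate the relative weight of each proxy
```

    Standardising the covariates before fitting would put the coefficients on a common scale, which is one simple way to compare the relative importance of the driving processes.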

    Identifying fine-scale archaeological features using KH-9 HEXAGON mapping and panoramic camera images : evidence from Liangzhu Ancient City

    Historical fine-scale information on archaeological landscapes is crucial in archaeological investigations. However, documenting such information using satellite sensor data prior to 2000 remains a daunting challenge. Images from the declassified archives of KH-9 HEXAGON (KH-9) cameras, including the panoramic camera system (PCS) and mapping camera system (MCS), offer fine-scale information about archaeological sites. However, noise, contrast distortion and the availability of only a single panchromatic band can limit their potential, particularly for identifying features in subtropical climates within heterogeneous landscape types. This paper focuses on developing a novel multifaceted analytical framework with two components: image pre-processing and feature identification. The image pre-processing component is divided into two steps. First, a trained stationary wavelet transform (SWT) based on the normalized sill (NS) is developed not only to de-noise the image, but also to preserve its original image characteristics. Then, the contrast of the de-noised images is optimized by the multi-resolution Top-hat (MTH) using multi-scale information. In the feature identification component, the MCS image is analysed using spatial colour composite write function memory (SCCWFM) and spatial novelty detection (SND). An ultra-fine spatial three-dimensional colour composite (UFSTCC) image and an ultra-fine spatial digital surface model (UFSDSM) are produced to aid interpretation of the KH-9 PCS images. The proposed processing pipelines were tested on KH-9 MCS and PCS images of the World Heritage site at Liangzhu Ancient City (LAC) in China, which is characterized by a subtropical climate and heterogeneous landscape types. The proposed pre-processing pipeline considerably improved the appearance of these images across the LAC landscape while maintaining the original image information. The developed digital analytical approaches for KH-9 PCS and MCS images facilitated straightforward identification of archaeological features in the LAC. The proposed framework has the potential to increase exploitation of the available KH-9 images in archaeological applications.
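
    A minimal sketch of the two pre-processing steps is given below, assuming a standard stationary wavelet transform with universal soft thresholding in place of the NS-trained threshold, and a single-scale white top-hat in place of the full multi-resolution Top-hat; the library calls (pywt, scikit-image), test image and parameters are illustrative only.

```python
# Sketch of SWT denoising followed by top-hat contrast enhancement on a
# stand-in panchromatic image; not the paper's trained NS/MTH pipeline.
import numpy as np
import pywt
from skimage import data, morphology

img = data.camera().astype(float)                  # stand-in panchromatic image

# 1) Stationary wavelet transform denoising (soft-threshold the detail bands)
coeffs = pywt.swt2(img, wavelet="db2", level=2)
sigma = np.median(np.abs(coeffs[-1][1][2])) / 0.6745   # noise estimate (finest diagonal)
thr = sigma * np.sqrt(2 * np.log(img.size))            # universal threshold
den = [(a, tuple(pywt.threshold(d, thr, mode="soft") for d in details))
       for a, details in coeffs]
denoised = pywt.iswt2(den, wavelet="db2")

# 2) Contrast enhancement: boost bright fine-scale structure with a white top-hat
footprint = morphology.disk(15)
enhanced = denoised + morphology.white_tophat(denoised, footprint)
print(enhanced.shape)
```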

    Bayesian estimation of incomplete data using conditionally specified priors

    In this paper, a broad class of conjugate prior distributions for estimating incomplete count data is presented. The new class of prior distributions arises from a conditional perspective, making use of the conditional specification methodology, and can be considered a generalisation of the prior distributions that have previously been used in the estimation of incomplete count data. Finally, some examples with simulated and real data are given.
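
    The sketch below illustrates Bayesian estimation with incomplete count data in the simplest conjugate setting, a Gamma prior on a Poisson rate with missing observations imputed by data augmentation within a Gibbs sampler; the conditionally specified prior class proposed in the paper is broader than this, so the example should be read as background rather than the paper's method.

```python
# Gamma-Poisson Gibbs sampler with missing counts imputed by data augmentation.
# With data missing completely at random the imputation step adds no information
# about the rate; it is included only to show the mechanics of the sampler.
import numpy as np

rng = np.random.default_rng(1)
y = rng.poisson(4.0, size=100).astype(float)
y[rng.random(100) < 0.2] = np.nan                 # ~20% of counts missing

a0, b0 = 1.0, 1.0                                 # Gamma(a0, b0) prior on the rate
obs = y[~np.isnan(y)]
n_mis = int(np.isnan(y).sum())

lam, draws = obs.mean(), []
for _ in range(2000):
    y_mis = rng.poisson(lam, size=n_mis)          # impute the missing counts
    a = a0 + obs.sum() + y_mis.sum()              # conjugate Gamma update
    b = b0 + len(obs) + n_mis
    lam = rng.gamma(a, 1.0 / b)
    draws.append(lam)

print(np.mean(draws[500:]))                       # posterior mean of the rate
```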

    Variation in survival after surgery for peri-ampullary cancer in a regional cancer network

    Background: Centralisation of specialist surgical services requires that patients are referred to a regional centre for surgery. This process may disadvantage patients who live far from the regional centre or are referred from other hospitals by making referral less likely and by delaying treatment, thereby allowing tumour progression. The aim of this study is to explore the outcome of surgery for peri-ampullary cancer (PC) with respect to referring hospital and travel distance for treatment within a network served by five hospitals. Methods: A review of a unit database was undertaken of patients undergoing surgery for PC between January 2006 and May 2014. Results: 394 patients were studied. Although both the median travel distance for patients from the five hospitals (10.8, 86, 78.8, 54.7 and 89.2 km) (p < 0.05) and the annual operation rate for PC (2.99, 3.29, 2.13, 3.32 and 3.07 per 100,000) (p = 0.044) were significantly different, no correlation was noted between patient travel distance and population operation rate at each hospital. No difference was noted between patients from each hospital in terms of resection completion rate or pathological stage of the resected tumours. The median survival after diagnosis for patients referred from different hospitals ranged from 1.2 to 1.7 years, and regression analysis revealed that increased travel distance to the regional centre was associated with a small survival advantage. Conclusion: Although variation in the provision and outcome of surgery for PC between regional hospitals is noted, this is not adversely affected by geographical isolation from the regional centre.
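
    The regression reported above is of the survival-analysis kind; a hedged sketch of such an analysis, a Cox proportional hazards model of survival time on travel distance fitted with the lifelines package on synthetic data, is shown below. The column names, covariates and data are hypothetical and not the study's model specification.

```python
# Illustrative Cox proportional hazards fit on synthetic data; not the study's model.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(2)
n = 200
travel_km = rng.uniform(5, 100, n)
# synthetic survival times with a weak dependence on distance (illustrative only)
time = rng.exponential(1.5 + 0.003 * travel_km, n)
died = (rng.random(n) < 0.8).astype(int)          # ~80% observed events

df = pd.DataFrame({"survival_years": time, "died": died, "travel_km": travel_km})

cph = CoxPHFitter()
cph.fit(df, duration_col="survival_years", event_col="died")
cph.print_summary()   # hazard ratio for travel_km; HR < 1 would correspond to
                      # longer travel distance being associated with lower hazard
```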

    Raw Data and Noise in Spectrophotometry

    Spectrophotometers are ubiquitous in chemical and biological science; however, their precision limits are under-appreciated. Rules of thumb and IUPAC-referenced guidance restricting the range of absorbance to minimize uncertainty are based on historically important instruments which are no longer as widely used. Instrumentation advances over the last half-century have changed the nature of spectrophotometric “raw” data while enabling opportunities to better evaluate their performance. Current IUPAC-referenced guidance indicates that absorbance should be limited to between 0.1 and 1.0 a.u. and that optimal performance (minimum relative standard deviation (RSD)) will be obtained at 0.43 a.u. or 0.86 a.u., depending on the type of limiting noise. We characterised noise in UV-Vis spectrophotometers across the spectrum and found wavelength-dependent variation in optimal performance. Optimal RSD approached neither extreme, with minima varying depending on wavelength. We could find no evidence justifying guidance restricting absorbance to between 0.1 and 1.0 a.u. Measured RSD and light intensity are more important than absorbance values for assuring good-quality measurements. Recovering light intensity estimates is a difficult inverse problem when I and I0 are not available, and the modern commercial instruments tested did not provide these. Based on this work, we recommend that IUPAC modernise the references in its Gold Book with up-to-date articles and press instrument makers to provide access to instrument raw data.
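
    The 0.43 a.u. and 0.86 a.u. figures quoted above follow from the two classical noise models, and the short calculation below reproduces them numerically: with A = -log10(T), the relative standard deviation of absorbance is proportional to 1/(T·A) for constant transmittance noise and to 1/(sqrt(T)·A) for shot-limited noise.

```python
# Worked check of the classic RSD minima: sigma_A = 0.4343 * sigma_T / T, so
# RSD = sigma_A / A has its minimum at A = 1/ln(10) ~ 0.434 for constant
# transmittance noise and at A = 2/ln(10) ~ 0.868 when sigma_T ~ sqrt(T).
import numpy as np

A = np.linspace(0.01, 2.5, 100000)
T = 10.0 ** (-A)

rsd_constant = 0.4343 / (T * A)            # sigma_T constant (arbitrary scale)
rsd_shot = 0.4343 * np.sqrt(T) / (T * A)   # sigma_T proportional to sqrt(T)

print(A[np.argmin(rsd_constant)])          # ~0.434 a.u.
print(A[np.argmin(rsd_shot)])              # ~0.868 a.u.
```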

    Reconstructing historical urban landscapes from KH-9 HEXAGON mapping camera system imagery : an example of Hangzhou City

    Declassified images from the Keyhole (KH)-9 HEXAGON mapping camera system (MCS) offer fine-scale details of urban regions. However, these images have seldom been utilized in urban research due to challenges in labelling (collecting training samples), the availability of only a single panchromatic band, and classification. To tackle these limitations, this paper focuses on developing a multi-stage reconstructed historical fine-scale urban landscape (RHFUL) pipeline for KH-9 HEXAGON MCS imagery. The proposed pipeline first integrates internalized parameters, hierarchical object-based image analysis properties and class variability to synthesize new features, abbreviated to IHC. Second, the pipeline uses a weak semi-automated supervised labelling (WSSL) approach to acquire training samples. Finally, the training samples and generated features are passed to the SegNet deep learning architecture. The performance of each step was assessed against corresponding state-of-the-art benchmark approaches for feature synthesis, labelling and classification. In the proposed RHFUL pipeline, the IHC features provided the most salient information for urban classification, WSSL labelled urban features more accurately, and the SegNet architecture classified the urban features more accurately than the benchmarks. Considering the potential advantages, but also the limitations, of KH-9 HEXAGON MCS images, further research should be undertaken, particularly drawing on current advances in pattern recognition techniques for contemporary digital satellite sensors.
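
    A minimal, hedged sketch of a SegNet-style network is given below to show the defining idea, reuse of the encoder's max-pooling indices during decoding; the channel sizes, depth and class count are illustrative, and the paper's actual architecture trained on the IHC-synthesized features is not reproduced.

```python
# Tiny SegNet-style encoder-decoder: pooling indices from the encoder drive the
# decoder's unpooling. Sizes and class count are illustrative only.
import torch
import torch.nn as nn

class TinySegNet(nn.Module):
    def __init__(self, in_ch=1, n_classes=4):
        super().__init__()
        self.enc = nn.Sequential(nn.Conv2d(in_ch, 16, 3, padding=1),
                                 nn.BatchNorm2d(16), nn.ReLU())
        self.pool = nn.MaxPool2d(2, return_indices=True)
        self.unpool = nn.MaxUnpool2d(2)
        self.dec = nn.Sequential(nn.Conv2d(16, 16, 3, padding=1),
                                 nn.BatchNorm2d(16), nn.ReLU(),
                                 nn.Conv2d(16, n_classes, 3, padding=1))

    def forward(self, x):
        x = self.enc(x)
        x, idx = self.pool(x)          # keep the pooling indices
        x = self.unpool(x, idx)        # unpool with the stored indices
        return self.dec(x)             # per-pixel class scores

logits = TinySegNet()(torch.randn(1, 1, 64, 64))
print(logits.shape)                    # torch.Size([1, 4, 64, 64])
```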