283 research outputs found

    Hirschman optimal transform least mean square adaptive filters.

    Abstract not available

    Towards Computer Aided Management of Kidney Disease

    Autosomal dominant polycystic kidney disease (ADPKD) is the fourth most common cause of kidney transplant worldwide, accounting for 7-10% of all cases. Although ADPKD usually progresses over many decades, accurate risk prediction is an important task. Identifying patients with progressive disease is vital to providing them with the new treatments being developed and to enabling them to enter clinical trials for new therapies. Among other factors, total kidney volume (TKV) is a major biomarker predicting the progression of ADPKD. The Consortium for Radiologic Imaging Studies in Polycystic Kidney Disease (CRISP) has shown that TKV is an early and accurate measure of cystic burden and likely growth rate, and that it is strongly associated with loss of renal function. While ultrasound (US) has proven to be an excellent tool for diagnosing the disease, monitoring short-term changes with ultrasound has been shown to be inaccurate. This is attributed to high operator variability and poor reproducibility compared to tomographic modalities such as CT and MR (the gold standard). Ultrasound has emerged as one of the standout modalities for intra-procedural imaging, and methods for spatial localization have afforded us the ability to track 2D ultrasound in the physical space in which it is being used. In addition, the vast amount of recorded tomographic data can be used to generate statistical shape models that allow us to extract clinical value from archived image sets. Renal volumetry is of great interest in the management of chronic kidney disease (CKD). In this work, we have implemented a tracked ultrasound system and developed a statistical shape model of the kidney. We use the tracked ultrasound to acquire a stack of slices that captures the region of interest, in our case kidney phantoms, and reconstruct a 3D volume from the spatially localized 2D slices. Approximate shape data is then extracted from this 3D volume using manual segmentation of the organ, and a shape model is fit to this data. This generates an instance from the shape model that best represents the scanned phantom, and volume calculation is performed on this instance. We observe that we can calculate the volume to within 10% error when compared to the gold-standard volume of the phantom.
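
    As a rough illustration of the final volumetry step described above, the sketch below (Python, with hypothetical function and variable names) computes a volume from a voxelized segmentation mask and the resulting error against a gold-standard phantom volume; the tracked-ultrasound reconstruction and shape-model fitting are assumed to have already produced the mask.

```python
import numpy as np

def volume_from_mask(mask: np.ndarray, spacing: tuple[float, float, float]) -> float:
    """Estimate organ volume (mm^3) from a binary segmentation mask.

    mask    -- 3D 0/1 array reconstructed from spatially localized 2D slices
    spacing -- physical voxel size (dx, dy, dz) in mm
    """
    voxel_volume = spacing[0] * spacing[1] * spacing[2]
    return float(mask.sum()) * voxel_volume

def percent_error(estimated: float, gold_standard: float) -> float:
    """Relative volume error against the gold-standard phantom volume."""
    return 100.0 * abs(estimated - gold_standard) / gold_standard

# Hypothetical usage with a 1 mm isotropic reconstruction of a kidney phantom:
# mask = reconstruct_and_segment(tracked_slices)   # placeholder for the pipeline
# print(percent_error(volume_from_mask(mask, (1.0, 1.0, 1.0)), 150_000.0))
```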

    Online Novelty Detection System: One-Class Classification of Systemic Operation

    Presented is an Online Novelty Detection System (ONDS) that uses Gaussian Mixture Models (GMMs) and one-class classification techniques to identify novel information in multivariate time-series data. Multiple data preprocessing methods are explored, and feature vectors are formed from frequency components obtained by the Fast Fourier Transform (FFT) and Welch's method of estimating Power Spectral Density (PSD). The number of features is reduced using bandpower schemes and Principal Component Analysis (PCA). The Expectation Maximization (EM) algorithm is used to learn GMM parameters from feature vectors collected only under normal operational conditions. One-class classification is achieved by thresholding likelihood values relative to statistical limits. The ONDS is applied to two applications from different domains. The first uses the ONDS to evaluate the systemic health of Radio Frequency (RF) power generators. Four different models of RF power generator and over 400 unique units are tested; an average robust true positive rate of 94.76% is achieved, with the best specificity reported as 86.56%. The second application uses the ONDS to identify novel events in equine motion data and assess equine distress. The ONDS correctly identifies target behaviors as novel events with 97.5% accuracy. Algorithm implementations for both applications are evaluated on embedded systems and demonstrate execution times appropriate for online use.
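
    A minimal sketch of the processing chain described above (Welch PSD features, PCA reduction, a GMM trained on normal-only data, and a likelihood threshold) using scipy and scikit-learn; the sampling rate, window length, component counts, and percentile threshold are illustrative assumptions, not the values used in the dissertation.

```python
import numpy as np
from scipy.signal import welch
from sklearn.decomposition import PCA
from sklearn.mixture import GaussianMixture

def psd_features(windows: np.ndarray, fs: float) -> np.ndarray:
    """Welch PSD estimate per window; each row of `windows` is one time-series segment."""
    return np.vstack([welch(w, fs=fs, nperseg=256)[1] for w in windows])

def fit_novelty_detector(normal_windows: np.ndarray, fs: float = 1000.0):
    """Train on normal-operation data only (one-class setting)."""
    feats = psd_features(normal_windows, fs)
    pca = PCA(n_components=10).fit(feats)
    gmm = GaussianMixture(n_components=4, covariance_type="full").fit(pca.transform(feats))
    # Statistical limit on the log-likelihood of normal data (1st percentile, illustrative).
    threshold = np.percentile(gmm.score_samples(pca.transform(feats)), 1.0)
    return pca, gmm, threshold

def is_novel(window: np.ndarray, pca, gmm, threshold, fs: float = 1000.0) -> bool:
    """Flag a window as novel when its log-likelihood falls below the learned limit."""
    feat = psd_features(window[None, :], fs)
    return bool(gmm.score_samples(pca.transform(feat))[0] < threshold)
```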

    An Optimization of Thermodynamic Efficiency vs. Capacity for Communications Systems

    This work provides a fundamental view of the mechanisms that affect the power efficiency of communications processes, along with a method for efficiency enhancement. Shannon's work is the definitive source for analyzing the information capacity of a communications system, but his formulation does not predict an efficiency relationship suitable for calculating the power consumption of a system, particularly for practical signals that may only approach the capacity limit. This work leverages Shannon's results while providing additional insight through physical models that enable the calculation and improvement of efficiency for the encoding of signals. The proliferation of mobile communications platforms is challenging network capacity, largely because of the ever-increasing data rate at each node. This places significant power management demands on personal computing devices as well as cellular and WLAN terminals. The increased data throughput translates to a shorter mean time between battery charging cycles and an increased thermal footprint. Solutions are developed herein to counter this trend. Hardware was constructed to measure the efficiency of a prototypical Gaussian signal prior to efficiency enhancement. After an optimization was performed, the efficiency of the encoding apparatus increased from 3.125% to greater than 86% for a manageable investment of resources. Several telecommunications standards-based waveforms were also tested on the same hardware. The results reveal that the developed physical theories extrapolate very accurately to an electronics application, predicting the efficiency of single-ended and differential encoding circuits before and after optimization.
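
    The abstract's reference point is Shannon's channel capacity; for context, the Shannon-Hartley form of that limit is stated below (the thesis's own efficiency model is not reproduced here).

```latex
% Shannon-Hartley capacity of a band-limited AWGN channel:
% B is the channel bandwidth in Hz, S/N the signal-to-noise power ratio.
C = B \log_2\!\left(1 + \frac{S}{N}\right) \ \text{bits/s}
```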

    Design of large polyphase filters in the Quadratic Residue Number System


    Temperature aware power optimization for multicore floating-point units


    Pricing Offshore Services: Evidence from the Paradise Papers

    The Paradise Papers represent one of the largest public data leaks, comprising 13.4 million confidential electronic documents. A dominant theory presented by Neal (2014) and Griffith, Miller and O'Connell (2014) concerns the use of these offshore services in the relocation of intellectual property for the purposes of compliance, privacy and tax avoidance. Building on the work of Fernandez (2011), Billio et al. (2016) and Kou, Peng and Zhong (2018) in Spatial Arbitrage Pricing Theory (s-APT) and work by Kelly, Lustig and Van Nieuwerburgh (2013), Ahern (2013), Herskovic (2018) and Procházková (2020) on the impacts of network centrality on firm pricing, we use market response, discussed in O'Donovan, Wagner and Zeume (2019), to characterise the role of offshore services in securities pricing and the transmission of price risk. Following the spatial model selection procedure proposed in Mur and Angulo (2009), we identify Profit Margin and Price-to-Research as firm characteristics describing market response over this event window. Using a social network lag explanatory model, we provide evidence for exogenous social effects, as described in Manski (1993), which may characterise the licensing or exchange of intellectual property between connected firms found in the Paradise Papers. From these findings, we hope to provide insight to policymakers on the role and impact of offshore services on securities pricing.
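
    A minimal sketch of what a social-network-lag (SLX-style) regression of this kind can look like, where neighbours' characteristics enter through a network-lag term W X to capture Manski-type exogenous effects; the data, the network matrix W, and the variable names are hypothetical placeholders, not the paper's estimation code.

```python
import numpy as np

def slx_fit(y: np.ndarray, X: np.ndarray, W: np.ndarray) -> np.ndarray:
    """Least-squares fit of y = a + X b + (W X) c + e.

    y -- market response per firm (hypothetical)
    X -- firm characteristics, e.g. profit margin, price-to-research (hypothetical)
    W -- row-normalised adjacency matrix of connected firms (hypothetical)

    The W X term carries the exogenous (contextual) network effects described
    by Manski (1993): a firm's outcome responds to its neighbours'
    characteristics, not only its own.
    """
    WX = W @ X
    Z = np.column_stack([np.ones(len(y)), X, WX])
    coeffs, *_ = np.linalg.lstsq(Z, y, rcond=None)
    return coeffs  # [intercept, own-characteristic effects b, contextual effects c]
```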

    Frequency weighted optimal Hankel-norm approximation of scalar linear systems

    This thesis addresses the problem of model reduction for scalar linear time-invariant systems via the optimal Hankel-norm approximation problem. Frequency weighting is combined with optimality in the Hankel norm to obtain a frequency-shaped approximation to a given linear system. This is accomplished by solving a modified optimal Hankel-norm approximation problem. Also presented is an error analysis for the frequency-shaped approximation.
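
    For context, a minimal numerical sketch of the Hankel singular values that underlie Hankel-norm approximation (the unweighted case only; the frequency-weighted formulation developed in the thesis is not reproduced here), assuming a stable continuous-time realization (A, B, C).

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

def hankel_singular_values(A: np.ndarray, B: np.ndarray, C: np.ndarray) -> np.ndarray:
    """Hankel singular values of a stable continuous-time system (A, B, C).

    P and Q solve the controllability and observability Lyapunov equations;
    the Hankel singular values are the square roots of the eigenvalues of P Q,
    and the largest one is the system's Hankel norm.
    """
    P = solve_continuous_lyapunov(A, -B @ B.T)    # A P + P A^T + B B^T = 0
    Q = solve_continuous_lyapunov(A.T, -C.T @ C)  # A^T Q + Q A + C^T C = 0
    eigvals = np.linalg.eigvals(P @ Q)
    return np.sort(np.sqrt(np.abs(eigvals.real)))[::-1]
```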

    The Affine Uncertainty Principle, Associated Frames and Applications in Signal Processing

    Uncertainty relations play a prominent role in signal processing, stating that a signal cannot be simultaneously concentrated in the two related domains of the corresponding phase space. In particular, a new uncertainty principle for the affine group, which is directly related to the wavelet transform, has led to a new minimizing waveform. In this thesis, a frame construction is proposed which leads to approximately tight frames based on this minimizing waveform. Frame properties such as the diagonality of the frame operator as well as lower and upper frame bounds are analyzed. Additionally, three applications of such frame constructions are introduced: inpainting of missing audio data, detection of neuronal spikes in extracellularly recorded data, and peak detection in MALDI imaging data.
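
    As a small illustration of the frame-bound analysis mentioned above, the sketch below computes lower and upper frame bounds of a finite collection of atoms from the eigenvalues of its frame operator; the waveform and sampling grid are left as placeholders rather than the thesis's minimizing waveform.

```python
import numpy as np

def frame_bounds(atoms: np.ndarray) -> tuple[float, float]:
    """Lower and upper frame bounds of a finite collection of frame elements.

    `atoms` holds one discretised frame element per row.  The bounds A and B
    are the smallest and largest eigenvalues of the frame operator
    S = Phi^H Phi; B/A close to 1 indicates an approximately tight frame with a
    nearly diagonal frame operator, while A = 0 means the rows are not a frame.
    """
    S = atoms.conj().T @ atoms
    eig = np.linalg.eigvalsh(S)
    return float(eig[0]), float(eig[-1])

# Hypothetical usage: rows of `atoms` would be dilated and translated copies of
# the minimizing waveform sampled on a chosen grid.
# A, B = frame_bounds(atoms)
```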

    Measuring aberrations in lithographic projection systems with phase wheel targets

    A significant factor in the degradation of nanolithographic image fidelity is optical wavefront aberration. Aerial image sensitivity to aberrations is currently much greater than in earlier lithographic technologies, a consequence of increased resolution requirements. Optical wavefront tolerances are dictated by the dimensional tolerances of the features printed, which require lens designs with a high degree of aberration correction. To increase lithographic resolution, lens numerical aperture (NA) must continue to increase and the imaging wavelength must decrease. Not only do aberration magnitudes scale inversely with wavelength, but high-order aberrations increase at a rate proportional to NA^2 or greater, as do aberrations across the image field. Achieving lithographic-quality diffraction-limited performance from an optical system, where the relatively low image contrast is further reduced by aberrations, requires the development of highly accurate in situ aberration measurement. In this work, phase wheel targets are used to generate an optical image, which can then be used to both describe and monitor aberrations in lithographic projection systems. The use of lithographic images is critical in this approach, since it ensures that optical system measurements are obtained during the system's standard operation. A mathematical framework is developed that translates image errors into the Zernike polynomial representation commonly used in the description of optical aberrations. The wavefront is decomposed into a set of orthogonal basis functions, and coefficients for the set are estimated from image-based measurements. A solution is deduced from multiple image measurements by combining different image sets. Correlations between aberrations and phase wheel image characteristics are modeled based on physical simulation and statistical analysis. The approach uses a well-developed rigorous simulation tool to model significant aspects of the lithography process and assess how aberrations affect the final image. The aberration impact on resulting image shapes is then examined, and approximations are identified so that the aberration computation can be cast in a fast compact-model form. Wavefront reconstruction examples are presented together with corresponding numerical results. A detailed analysis is given along with empirical measurements and a discussion of measurement capabilities. Finally, the impact of systematic errors in exposure tool parameters is measurable from empirical data and can be removed in the calibration stage of wavefront analysis.
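
    As a generic illustration of the wavefront-decomposition step described above, the sketch below fits a handful of low-order Zernike coefficients to sampled wavefront errors by least squares; the phase-wheel-specific compact model that maps image errors to these coefficients is not reproduced, and the pupil sampling points and measurement vector are placeholders.

```python
import numpy as np

def zernike_basis(rho: np.ndarray, theta: np.ndarray) -> np.ndarray:
    """A few low-order Zernike terms evaluated on pupil samples (rho <= 1).

    Columns: piston, x/y tilt, defocus, astigmatism (0/45 deg),
    coma (x/y), primary spherical.
    """
    return np.column_stack([
        np.ones_like(rho),
        rho * np.cos(theta), rho * np.sin(theta),
        2 * rho**2 - 1,
        rho**2 * np.cos(2 * theta), rho**2 * np.sin(2 * theta),
        (3 * rho**3 - 2 * rho) * np.cos(theta), (3 * rho**3 - 2 * rho) * np.sin(theta),
        6 * rho**4 - 6 * rho**2 + 1,
    ])

def fit_zernike(rho: np.ndarray, theta: np.ndarray, wavefront_samples: np.ndarray) -> np.ndarray:
    """Least-squares estimate of Zernike coefficients from sampled wavefront errors."""
    Z = zernike_basis(rho, theta)
    coeffs, *_ = np.linalg.lstsq(Z, wavefront_samples, rcond=None)
    return coeffs
```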