
    Dispersive Fourier Transformation for Versatile Microwave Photonics Applications

    Dispersive Fourier transformation (DFT) uses chromatic dispersion to map the broadband spectrum of an ultrashort optical pulse into a time-stretched waveform whose intensity profile mirrors the spectrum. Owing to its capability for continuous, pulse-by-pulse spectroscopic measurement and manipulation, DFT has become an emerging technique for ultrafast signal generation and processing and for high-throughput real-time measurements, where the speed of traditional optical instruments falls short. In this paper, the principle and implementation methods of DFT are first introduced, and recent developments in employing the DFT technique for widespread microwave photonics applications are presented, with emphasis on real-time spectroscopy, microwave arbitrary waveform generation, and microwave spectrum sensing. Finally, possible future research directions for DFT-based microwave photonics techniques are discussed.
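    The frequency-to-time mapping is straightforward to reproduce numerically. Below is a minimal simulation sketch (not from the paper; all parameter values are illustrative) in which a short pulse acquires the quadratic spectral phase of a dispersive fiber, after which its temporal intensity profile mirrors its optical spectrum.

    import numpy as np

    N, dt = 2**16, 5e-15                      # time grid: 65536 points, 5 fs step
    t = (np.arange(N) - N // 2) * dt
    w = 2 * np.pi * np.fft.fftfreq(N, dt)     # angular frequency grid

    # Input: a ~100 fs pulse whose spectrum has two lobes (a resolvable feature).
    u0 = np.exp(-(t / 100e-15) ** 2) * np.cos(2 * np.pi * 2e12 * t)
    U0 = np.fft.fft(u0)

    # Chromatic dispersion over length L: H(w) = exp(-0.5j * beta2 * L * w**2).
    beta2, L = -21.7e-27, 100.0               # s^2/m (SMF near 1550 nm), 100 m
    u_out = np.fft.ifft(U0 * np.exp(-0.5j * beta2 * L * w ** 2))

    # Far-field regime (|beta2 * L| >> pulse duration squared): the output
    # intensity is a stretched copy of the spectrum under w = t / (beta2 * L).
    waveform = np.abs(u_out) ** 2
    print("stretched duration ~", np.ptp(t[waveform > 0.01 * waveform.max()]), "s")

    Doubling L doubles the stretch factor, trading temporal resolution against recording length, which is the knob the paper's applications exploit.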

    Comprehensive survey on quality of service provisioning approaches in cognitive radio networks: part one

    Much interest in Cognitive Radio Networks (CRNs) has been raised recently because they enable unlicensed (secondary) users to utilize the unused portions of the licensed spectrum. CRN utilization of the residual spectrum bands of Primary (licensed) Networks (PNs) must avoid harmful interference to the users of PNs and of other overlapping CRNs. The coexistence of CRNs depends on four components: Spectrum Sensing, Spectrum Decision, Spectrum Sharing, and Spectrum Mobility. Various approaches have been proposed to improve Quality of Service (QoS) provisioning in CRNs under fluctuating spectrum availability. However, CRN implementation poses many technical challenges due to the sporadic usage of licensed spectrum bands, which will increase after CRNs are deployed. Unlike traditional surveys of CRNs, this paper addresses the QoS provisioning approaches of the CRN components and provides an up-to-date, comprehensive survey of recent improvements in these approaches. Major features of the open research challenges of each approach are investigated. Due to the extensive nature of the topic, this paper is the first part of the survey and investigates QoS approaches for the spectrum sensing and spectrum decision components; the remaining approaches, for the spectrum sharing and mobility components, will be investigated in the second part.
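    As a concrete illustration of the sensing component, the sketch below implements energy detection, the most widely used spectrum sensing primitive; this is a generic textbook formulation under illustrative assumptions, not a method from the survey itself.

    import numpy as np
    from scipy import stats

    def sense_channel(x, noise_var, p_fa=0.05):
        """Declare the channel busy if the received energy exceeds a
        threshold chosen for a target false-alarm probability p_fa."""
        n = len(x)
        energy = np.sum(np.abs(x) ** 2)
        # Noise-only hypothesis: 2 * energy / noise_var ~ chi-squared, 2n d.o.f.
        threshold = 0.5 * noise_var * stats.chi2.ppf(1 - p_fa, df=2 * n)
        return energy > threshold

    rng = np.random.default_rng(0)
    n, noise_var = 1000, 1.0
    noise = np.sqrt(noise_var / 2) * (rng.normal(size=n) + 1j * rng.normal(size=n))
    primary = 0.35 * np.exp(2j * np.pi * 0.1 * np.arange(n))  # weak licensed signal
    print("idle channel flagged busy?    ", sense_channel(noise, noise_var))
    print("occupied channel flagged busy?", sense_channel(noise + primary, noise_var))

    The trade-off this threshold encodes, between missed detection (harmful interference to the PN) and false alarm (lost secondary throughput), is exactly the QoS tension the surveyed sensing approaches manage.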

    Computational Spectral Imaging: A Contemporary Overview

    Spectral imaging collects and processes information along spatial and spectral coordinates quantified in discrete voxels, so the data can be treated as a 3D spectral data cube. Spectral images (SIs) allow identifying objects, crops, and materials in a scene through their spectral behavior. Since most spectral optical systems can employ only 1D or at most 2D sensors, it is challenging to acquire the 3D information directly from available commercial sensors. As an alternative, computational spectral imaging (CSI) has emerged as a sensing tool in which the 3D data are obtained from 2D encoded projections; a computational recovery process must then be employed to retrieve the SI. CSI enables the development of snapshot optical systems that reduce acquisition time and incur low computational storage costs compared to conventional scanning systems. Recent advances in deep learning (DL) have allowed the design of data-driven CSI to improve SI reconstruction or even to perform high-level tasks such as classification, unmixing, or anomaly detection directly from the 2D encoded projections. This work summarises the advances in CSI, starting with SIs and their relevance and continuing with the most relevant compressive spectral optical systems; CSI with DL is then introduced, along with recent advances in combining physical optical design with computational DL algorithms to solve high-level tasks.
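    The 2D encoded projection is easiest to see in a toy forward model. The sketch below loosely follows a coded-aperture snapshot system of the CASSI type: each band of the cube is masked by a binary coded aperture, sheared by one pixel per band to mimic dispersion, and integrated on the detector. The shapes and the one-pixel-per-band shear are illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(1)
    H, W, B = 64, 64, 8                              # rows, cols, spectral bands
    cube = rng.random((H, W, B))                     # toy 3D spectral data cube
    code = (rng.random((H, W)) > 0.5).astype(float)  # binary coded aperture

    def forward(cube, code):
        """2D encoded projection: mask each band, shear it, sum on detector."""
        H, W, B = cube.shape
        y = np.zeros((H, W + B - 1))                 # detector widened by shear
        for k in range(B):
            y[:, k:k + W] += code * cube[:, :, k]    # band k shifted by k pixels
        return y

    y = forward(cube, code)
    print("voxels:", cube.size, "-> measurements:", y.size)  # ~7x compression
    # Recovering the cube from y is the ill-posed inverse problem that the
    # sparse-recovery and deep-learning methods surveyed here address.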

    Algorithms design for improving homecare using Electrocardiogram (ECG) signals and Internet of Things (IoT)

    Due to fast population growth, many hospitals become crowded by the huge number of patient visits. Moreover, during COVID-19 many patients preferred staying at home to minimize the spread of the virus, so the need to provide care to patients at home is essential. The Internet of Things (IoT) is widely known and used in different fields, and IoT-based homecare can help reduce the burden upon hospitals. IoT combined with homecare brings several benefits, such as minimizing human exertion, economic savings, and improved efficiency and effectiveness. One important requirement of a homecare system is accuracy, because such systems deal with human health, which is sensitive and demands a high degree of accuracy. Moreover, these systems handle huge amounts of data due to continuous sensing, and the data must be processed well to provide a fast diagnostic response at minimum cost. The heart is one of the most important organs in the human body and requires a high level of care. Monitoring heart status can diagnose disease at an early stage and help health experts find the best medication plan, but continuous monitoring and diagnosis of the heart can exhaust caregivers' efforts; an IoT heart-monitoring model at home is the solution to this problem. Electrocardiogram (ECG) signals are used to track heart condition through their waves and peaks, and accurate, efficient IoT ECG monitoring at home can detect heart diseases and save human lives.
    Consequently, an IoT ECG homecare monitoring model is designed in this thesis for detecting cardiac arrhythmia and diagnosing heart diseases. Two databases of ECG signals are used: an online database, which is old and limited, and a huge, unique database collected from real patients in a hospital. The raw ECG signal of each patient is passed through the implemented low-pass filter and Savitzky-Golay filter signal-processing stages to remove noise and external interference. The cleaned signal is then passed through a feature extraction stage that derives a number of features from metrics and medical information, together with a feature extraction algorithm that finds peaks and waves. These features are saved in a local database for classification. For diagnosis, the classification stage uses three approaches, threshold values, machine learning, and deep learning, to increase accuracy. The threshold-value technique works from medical reference values and border lines: if any feature goes above or below these ranges, a warning message appears with the expected heart disease. The second approach uses machine learning to minimize human effort: a Support Vector Machine (SVM) algorithm is run on the features extracted from both databases, achieving classification accuracies of 91.67% and 94% on the online and hospital databases respectively. Because the decision boundary is non-linear, a third approach using deep learning is presented: a full Multilayer Perceptron (MLP) neural network is implemented to improve accuracy, reducing the error to 0.019 and 0.006 on the online and hospital databases respectively. Since the hospital database is huge, a technique is needed to reduce the amount of data; accordingly, a novel adaptive amplitude-threshold compression algorithm is proposed.
    This algorithm can diagnose heart disease from compressed ECG signals at reduced size with high accuracy and low cost. The features extracted from the compressed and original signals are similar, with only slight differences of 1%, 2%, and 3%, with no effect on machine learning or deep learning classification accuracy and no need for reconstruction. With data compression, throughput improves by 43% and storage space is reduced by 57%. Moreover, to achieve a fast response, the amount of data should be reduced further for fast transmission, so a compressive-sensing-based cardiac homecare system is presented; it allows the channel between sender and receiver to carry a small amount of data. Experimental results reveal that the proposed models are more accurate in classifying cardiac arrhythmia and diagnosing heart diseases, and they ensure fast diagnosis at minimum cost. Based on experiments on classification accuracy, number of errors, and false alarms, the compressive sensing dictionary size is selected to be 900. As a result, this thesis provides three different scenarios for IoT homecare cardiac monitoring to assist further research in designing such systems; the experimental results reveal that these scenarios produce better results with high accuracy while minimizing data and cost requirements.
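    A minimal sketch of the kind of filtering and feature-extraction front end described above is given below, using standard SciPy routines; the cutoff, window length, and synthetic ECG signal are illustrative assumptions, not the thesis's actual parameters or data.

    import numpy as np
    from scipy.signal import butter, filtfilt, savgol_filter, find_peaks

    fs = 360                                    # sampling rate in Hz
    t = np.arange(0, 10, 1 / fs)
    # Toy ECG: a 72 bpm train of sharp "R waves" plus mains hum and noise.
    ecg = sum(np.exp(-((t - r) ** 2) / (2 * 0.01 ** 2))
              for r in np.arange(0.5, 10, 60 / 72))
    ecg += 0.1 * np.sin(2 * np.pi * 50 * t)
    ecg += 0.05 * np.random.default_rng(2).normal(size=t.size)

    # Stage 1: low-pass filter removes mains interference and HF noise.
    b, a = butter(4, 40 / (fs / 2), btype="low")
    ecg_f = filtfilt(b, a, ecg)
    # Stage 2: Savitzky-Golay smoothing preserves the sharp QRS complexes.
    ecg_f = savgol_filter(ecg_f, window_length=15, polyorder=3)

    # Feature extraction: R-peak locations and heart rate from RR intervals.
    peaks, _ = find_peaks(ecg_f, height=0.5, distance=int(0.4 * fs))
    rr = np.diff(peaks) / fs
    print(f"{len(peaks)} beats, mean heart rate = {60 / rr.mean():.1f} bpm")

    Features such as the RR intervals and peak amplitudes would then feed the threshold, SVM, or MLP classifiers described in the abstract.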

    Robust and Efficient Inference of Scene and Object Motion in Multi-Camera Systems

    Multi-camera systems can overcome some of the fundamental limitations of single-camera systems: having multiple viewpoints of a scene goes a long way toward limiting the influence of field of view, occlusion, blur, and poor resolution of an individual camera. This dissertation addresses robust and efficient inference of object motion and scene structure in multi-camera and multi-sensor systems. The first part of the dissertation discusses the role of constraints introduced by projective imaging in robust inference of multi-camera/sensor-based object motion. We discuss the role of the homography and epipolar constraints in fusing object motion perceived by individual cameras. For planar scenes, the homography constraint provides a natural mechanism for data association; for scenes that are not planar, the epipolar constraint provides a weaker multi-view relationship. We use the epipolar constraint for tracking in multi-camera and multi-sensor networks. In particular, we show that the epipolar constraint reduces the dimensionality of the state space of the problem by introducing a "shared" state space for the joint tracking problem, which allows robust tracking even when one of the sensors fails due to poor SNR or occlusion. The second part of the dissertation deals with challenges in the computational aspects of tracking algorithms that are common to such systems. Much of the inference in multi-camera and multi-sensor networks involves complex non-linear models corrupted by non-Gaussian noise, and particle filters provide approximate Bayesian inference in such settings. We analyze the computational drawbacks of traditional particle filtering algorithms and present a method for implementing the particle filter using the Independent Metropolis-Hastings sampler, which is highly amenable to pipelined implementation and parallelization. We analyze implementations of the proposed algorithm and concentrate in particular on implementations with minimum processing times. The last part of the dissertation deals with the efficient sensing paradigm of compressive sensing (CS) applied to signals in imaging, such as natural images and reflectance fields. We propose a hybrid signal model on the assumption that most real-world signals exhibit subspace compressibility as well as sparse representations, and we show that several real-world visual signals, such as images, reflectance fields, and videos, are better approximated by this hybrid of the two models. We derive optimal hybrid linear projections of the signal and show that theoretical guarantees and algorithms designed for CS extend easily to hybrid subspace-compressive sensing. Such methods reduce the amount of information sensed by a camera and help mitigate the so-called data-deluge problem in large multi-camera systems.
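    To make the data-association role of the epipolar constraint concrete, the toy sketch below checks the residual x2^T F x1, which is zero for a true correspondence between two views and clearly nonzero for a mismatch; the camera geometry and points are illustrative, not from the dissertation.

    import numpy as np

    def skew(v):
        """Cross-product matrix [v]_x, so that skew(v) @ u == np.cross(v, u)."""
        return np.array([[0, -v[2], v[1]],
                         [v[2], 0, -v[0]],
                         [-v[1], v[0], 0]])

    # Two calibrated cameras: camera 2 sees X2 = R @ X1 + t.
    K = np.eye(3)                                  # identity intrinsics
    R, t = np.eye(3), np.array([1.0, 0.0, 0.0])    # pure baseline along x
    E = skew(t) @ R                                # essential matrix
    F = np.linalg.inv(K).T @ E @ np.linalg.inv(K)  # fundamental matrix

    X = np.array([0.2, -0.1, 4.0])                 # a 3D point, camera-1 frame
    x1 = K @ X; x1 /= x1[2]                        # its pixel in camera 1
    x2 = K @ (R @ X + t); x2 /= x2[2]              # its pixel in camera 2

    print("true match residual: ", abs(x2 @ F @ x1))     # ~0: consistent pair
    x_bad = x2 + np.array([0.0, 0.05, 0.0])              # a mismatched pixel
    print("false match residual:", abs(x_bad @ F @ x1))  # clearly nonzero

    Thresholding this residual is a simple way to decide which detections in two views may belong to the same object before joint tracking.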

    Testbed for Sub-Nyquist Wideband Spectrum Monitoring

    Radio spectrum management is a present-day concern, mainly due to the misuse of this resource over the years, especially in the UHF band. To address this problem, a testbed for sub-Nyquist wideband spectrum monitoring was built, including a web interface for remotely measuring the occupancy of the UHF band. To achieve this, an RF front end was built that can tune UHF frequencies with an instantaneous bandwidth of 95 MHz; a random demodulator was connected after it, and an embedded system then performed the sub-Nyquist sampling and spectrum recovery. The embedded system connects to an information system that serves a web page through which remote users can monitor the UHF band. Experimental results showed that spectrum sensing can be achieved with different algorithms on sufficiently sparse spectra. In addition, the web interface allowed simultaneous user connections, so that independent measurements could be performed while sharing a single hardware subsystem.
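    The sub-Nyquist chain is easy to emulate end to end. The sketch below follows the random-demodulator idea in discrete time (mix with a pseudo-random +/-1 chipping sequence, integrate-and-dump to a low rate, then recover a sparse spectrum, here via orthogonal matching pursuit); the sizes and the OMP recovery step are illustrative assumptions, not the testbed's actual design.

    import numpy as np

    def omp(A, y, k):
        """Orthogonal matching pursuit: greedy k-sparse solution of A @ s = y."""
        r, idx = y.copy(), []
        for _ in range(k):
            idx.append(int(np.argmax(np.abs(A.conj().T @ r))))
            s_ls, *_ = np.linalg.lstsq(A[:, idx], y, rcond=None)
            r = y - A[:, idx] @ s_ls
        s_hat = np.zeros(A.shape[1], complex)
        s_hat[idx] = s_ls
        return s_hat

    rng = np.random.default_rng(3)
    N, M, K = 256, 64, 3                        # Nyquist length, low-rate samples, sparsity

    support = rng.choice(N, K, replace=False)   # K occupied bins (sparse spectrum)
    s = np.zeros(N); s[support] = rng.uniform(1, 2, K)
    F = np.fft.ifft(np.eye(N), axis=0) * np.sqrt(N)   # unitary inverse-DFT basis
    x = F @ s                                   # time-domain multitone input

    chips = rng.choice([-1.0, 1.0], N)          # pseudo-random mixing sequence
    Hmat = np.kron(np.eye(M), np.ones(N // M))  # integrate-and-dump to rate M/N
    y = Hmat @ (chips * x)                      # M sub-Nyquist measurements
    A = Hmat @ np.diag(chips) @ F               # effective sensing matrix

    s_hat = omp(A, y, K)
    print("true bins:     ", np.sort(support))
    print("recovered bins:", np.sort(np.flatnonzero(np.abs(s_hat) > 0.1)))

    At M/N = 1/4 the chain digitizes at a quarter of the Nyquist rate while still locating the occupied bins, which mirrors the premise of the testbed's UHF occupancy measurements.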