
    Information theoretic approach for assessing image fidelity in photon-counting arrays

    The method of photon-counting integral imaging has been introduced recently for three-dimensional object sensing, visualization, recognition and classification of scenes under photon-starved conditions. This paper presents an information-theoretic model for the photon-counting imaging (PCI) method, thereby providing a rigorous foundation for the merits of PCI in terms of image fidelity. This, in turn, can facilitate our understanding of the demonstrated success of photon-counting integral imaging in compressive imaging and classification. The mutual information between the source and photon-counted images is derived in a Markov random field setting and normalized by the source image's entropy, yielding a fidelity metric between zero and unity, corresponding respectively to complete loss of information and full preservation of information. Calculations suggest that the PCI fidelity metric increases with spatial correlation in the source image, from which we infer that the PCI method is particularly effective for source images with high spatial correlation; the metric also increases as photon-number uncertainty decreases. As an application of the theory, an image-classification problem is considered, showing a congruous relationship between the fidelity metric and the classifier's performance.
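    The metric described is the mutual information between source and photon-counted images normalized by the source entropy, F = I(X;Y)/H(X). The sketch below computes this quantity for a toy discrete joint distribution; it is a minimal Python illustration, not the paper's Markov-random-field derivation, and the example pmf and names are invented.

        import numpy as np

        def fidelity_metric(p_xy):
            """Normalized mutual information F = I(X;Y) / H(X), in [0, 1].

            p_xy: 2-D array giving the joint pmf of source pixel X and
            photon-counted observation Y (rows: X, columns: Y).
            """
            p_xy = p_xy / p_xy.sum()          # ensure a valid joint pmf
            p_x = p_xy.sum(axis=1)            # marginal of the source
            p_y = p_xy.sum(axis=0)            # marginal of the observation
            # I(X;Y) = sum over (x,y) of p(x,y) log2( p(x,y) / (p(x)p(y)) )
            mask = p_xy > 0
            ratio = p_xy[mask] / np.outer(p_x, p_y)[mask]
            mi = np.sum(p_xy[mask] * np.log2(ratio))
            # Source entropy H(X)
            h_x = -np.sum(p_x[p_x > 0] * np.log2(p_x[p_x > 0]))
            return mi / h_x

        # Toy example: a binary source observed through a noisy photon channel.
        joint = np.array([[0.45, 0.05],       # X=0: mostly observed as Y=0
                          [0.10, 0.40]])      # X=1: mostly observed as Y=1
        print(fidelity_metric(joint))         # ~0.4: partial preservation

    A perfectly noiseless channel (diagonal joint pmf) gives F = 1, and an observation independent of the source gives F = 0, matching the endpoints described in the abstract.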

    Dynamic Cloud Network Control under Reconfiguration Delay and Cost

    Network virtualization and programmability allow operators to deploy a wide range of services over a common physical infrastructure and to elastically allocate cloud and network resources according to changing requirements. While the elastic reconfiguration of virtual resources enables dynamic capacity scaling to support service demands at minimal operational cost, reconfiguration operations make resources unavailable for a given time period and may incur additional cost. In this paper, we address the dynamic cloud network control problem under non-negligible reconfiguration delay and cost. We show that while the capacity region remains unchanged regardless of the reconfiguration delay/cost values, a reconfiguration-agnostic policy may fail to guarantee throughput optimality and minimum cost under nonzero reconfiguration delay/cost. We then present an adaptive dynamic cloud network control policy that allows network nodes to make local flow-scheduling and resource-allocation decisions while controlling the frequency of reconfiguration, in order to support any input rate in the capacity region and achieve arbitrarily close to minimum cost for any finite reconfiguration delay/cost values.

    Comment: 15 pages, 7 figures
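    The key mechanism here is limiting how often nodes reconfigure so that reconfiguration delay/cost is amortized. The sketch below shows one plausible reading of that idea: a max-weight style configuration choice with a hysteresis threshold. The function name, the bias factor, and the toy rates are assumptions for illustration, not the paper's actual policy.

        import numpy as np

        def choose_configuration(queues, service_rates, current, switch_cost, bias=2.0):
            """Hysteresis-based reconfiguration decision (illustrative sketch).

            queues: backlog of each commodity; service_rates[k] gives the
            per-commodity rates of configuration k. The node switches away
            from the current configuration only when the max-weight gain
            outweighs the reconfiguration cost scaled by a hysteresis bias,
            which keeps reconfigurations infrequent.
            """
            weights = [np.dot(queues, rates) for rates in service_rates]
            best = int(np.argmax(weights))
            # Stay put unless the best configuration beats the current one
            # by more than the (biased) reconfiguration cost.
            if weights[best] - weights[current] > bias * switch_cost:
                return best      # reconfigure; resource unavailable during the delay
            return current       # keep serving with the current configuration

        # Toy example: two configurations, each favoring one commodity.
        queues = np.array([30.0, 5.0])
        rates = [np.array([1.0, 0.2]),    # config 0 favors commodity 0
                 np.array([0.2, 1.0])]    # config 1 favors commodity 1
        print(choose_configuration(queues, rates, current=1, switch_cost=5.0))  # -> 0

    Raising the bias trades responsiveness for fewer costly reconfigurations, which is the tension the abstract describes.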

    A New Technique in Saving Fingerprint with Low Volume by Using Chaos Game and Fractal Theory

    The fingerprint is one of the simplest and most reliable biometric features for human identification. In this study, fractal theory and the Chaos Game are used to construct a new fractal from a fingerprint. During construction of the fractal via the Chaos Game mechanism, several parameters that can be used in the identification process are extracted. A fractal is built for each fingerprint, and 10 parameters carrying the information necessary for identification are saved. Thus, 10 decimal parameters with 0.02 precision are stored instead of the fingerprint image or parts of it, greatly reducing the storage volume required for fingerprint-based identification.
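    The abstract names the Chaos Game as the mechanism but does not spell out how fingerprint data seeds it. The sketch below implements the standard Chaos Game iteration; treating the anchor points as fingerprint-derived features (e.g., minutiae coordinates) is an assumption for illustration only.

        import random

        def chaos_game(anchors, ratio=0.5, n_points=10000, seed=0):
            """Standard Chaos Game: repeatedly jump a fraction `ratio` of the
            way toward a randomly chosen anchor point, collecting the orbit.

            anchors: list of (x, y) points. Deriving these from fingerprint
            features (e.g., minutiae) is an assumption; the abstract does
            not specify the exact mapping.
            """
            rng = random.Random(seed)
            x, y = anchors[0]
            points = []
            for _ in range(n_points):
                ax, ay = rng.choice(anchors)
                x, y = x + ratio * (ax - x), y + ratio * (ay - y)
                points.append((x, y))
            return points

        # Toy example: three anchors yield the Sierpinski triangle.
        pts = chaos_game([(0, 0), (1, 0), (0.5, 0.866)])
        print(len(pts), pts[-1])

    Statistics of the resulting orbit (rather than the orbit itself) would then supply compact parameters of the kind the abstract stores, at far lower volume than the raw image.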

    Ransomware Detection Using Federated Learning with Imbalanced Datasets

    Ransomware is a type of malware that encrypts user data and extorts payments in return for the decryption keys. This cyberthreat is one of the most serious challenges facing organizations today and has already caused immense financial damage. As a result, many researchers have been developing techniques to counter ransomware. Recently, the federated learning (FL) approach has also been applied to ransomware analysis, allowing corporations to achieve scalable, effective detection and attribution without having to share their private data. In reality, however, there is much variation in the quantity and composition of ransomware data collected across multiple FL client sites/regions. This imbalance will inevitably degrade the effectiveness of any defense mechanisms. To address this concern, a modified FL scheme is proposed that uses a weighted cross-entropy loss function to mitigate dataset imbalance. A detailed performance evaluation study is then presented for the case of static analysis using the latest Windows-based ransomware families. The findings confirm improved ML classifier performance for a highly imbalanced dataset.

    Comment: 6 pages, 4 figures, 3 tables
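    The proposed fix amounts to a class-weighted cross-entropy loss at each FL client. The sketch below builds such a loss in PyTorch with weights inversely proportional to local class frequency; the inverse-frequency scheme and all names are assumptions, since the abstract does not give the paper's exact weighting.

        import torch
        import torch.nn as nn

        def make_weighted_loss(labels, num_classes):
            """Class-weighted cross-entropy for an imbalanced client dataset.

            Weights are set inversely proportional to local class frequency
            (an assumption; the paper's exact scheme may differ), so rare
            ransomware families contribute more per sample to the loss.
            """
            counts = torch.bincount(labels, minlength=num_classes).float()
            weights = counts.sum() / (num_classes * counts.clamp(min=1))
            return nn.CrossEntropyLoss(weight=weights)

        # Toy example: a client whose local data is dominated by class 0.
        labels = torch.tensor([0] * 90 + [1] * 8 + [2] * 2)
        loss_fn = make_weighted_loss(labels, num_classes=3)
        logits = torch.randn(100, 3)
        print(loss_fn(logits, labels))   # minority classes weighted up

    Each client would plug its own locally computed weights into its training loop, which requires no sharing of raw data and so fits the FL setting the abstract describes.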