
    Signal processing based method for solving inverse scattering problems

    The problem of reconstructing an image of the permittivity distribution inside a penetrable, strongly scattering object from a finite number of noisy scattered-field measurements has always been very challenging because it is ill-posed in nature. Several techniques have been developed, but they are either computationally very expensive or typically require the object to be weakly scattering. Here I develop a non-linear signal processing method that recovers images of both strong and weak scatterers. This nonlinear, or cepstral, filtering method requires that the scattered-field data first be preprocessed to generate a minimum-phase function in the object domain. For 2-D and higher-dimensional problems, I describe the conditions for minimum phase and demonstrate how an artificial reference wave can be numerically combined with the measured complex scattering data to enforce this condition by satisfying Rouché's theorem. In the cepstral domain, one can filter the frequencies associated with the object from those of the scattered field. After filtering, the data are inverse Fourier transformed and exponentiated to recover the image of the object under test. I also investigate the scattered-field sampling requirements for the inverse scattering problem. The proposed inversion technique is applied to measured experimental data to recover both the shape and the relative permittivity of unknown objects. The results confirm the effectiveness of this algorithm and show that one can identify optimal reference-wave parameters and an optimal procedure that yield good reconstructions of a penetrable, strongly scattering permittivity distribution.
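    As a reading aid, here is a minimal Python/NumPy sketch of the cepstral filtering pipeline the abstract describes. The function name, the constant reference amplitude, and the corner-shaped low-quefrency mask are illustrative assumptions, not the author's exact procedure.

        import numpy as np

        def cepstral_reconstruction(scattered_field, ref_amplitude, cutoff):
            # 1. Combine the measured field with an artificial reference wave
            #    strong enough that |reference| > |scattered field| everywhere,
            #    the Rouché-type condition used to enforce minimum phase.
            total = ref_amplitude + scattered_field

            # 2. Move to the cepstral domain: complex log, then Fourier transform.
            cepstrum = np.fft.fft2(np.log(total))  # log is safe: |total| > 0

            # 3. Filter: keep the low "quefrencies" attributed to the object,
            #    reject those attributed to the scattered field.
            mask = np.zeros(cepstrum.shape)
            mask[:cutoff, :cutoff] = mask[:cutoff, -cutoff:] = 1
            mask[-cutoff:, :cutoff] = mask[-cutoff:, -cutoff:] = 1

            # 4. Inverse Fourier transform and exponentiate to recover an
            #    estimate of the object (its permittivity contrast image).
            return np.exp(np.fft.ifft2(cepstrum * mask))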

    Harnessing the Power of Distributed Computing: Advancements in Scientific Applications, Homomorphic Encryption, and Federated Learning Security

    The data explosion poses many challenges to state-of-the-art systems, applications, and methodologies. It has been reported that 181 zettabytes of data will be generated in 2025, an increase of more than 150% over the volume expected in 2023. While system manufacturers consistently develop devices with larger storage and offer alternative cloud storage at affordable rates, a key remaining challenge is how to process even a fraction of this large-scale stored data within the time constraints of conventional systems. One transformative paradigm revolutionizing the processing and management of such data is distributed computing, whose application requires deep understanding. This dissertation explores the impact of applying efficient distributed-computing concepts to long-standing challenges in (i) a widely used data-intensive scientific application, (ii) homomorphic encryption (HE) for data-intensive workloads in outsourced databases, and (iii) the security of tokenized incentive mechanisms for federated learning (FL) systems.

    The first part tackles the microelectrode array (MEA) parameterization problem from an orthogonal viewpoint informed by algebraic topology, which allows us to algebraically parameterize MEAs whose structure and intrinsic parallelism are otherwise hard to identify. We implement a new paradigm, Parma, to demonstrate the effectiveness of the proposed approach and report how it outperforms the state of the practice in time, scalability, and memory usage.

    The second part introduces parallel caching of secure aggregation to mitigate the performance overhead incurred by the HE module in outsourced databases. The key idea is to cache selected radix ciphertexts in parallel without violating the existing security guarantees of the base HE scheme. A new radix HE algorithm was designed and applied to both batch and incremental HE schemes; experiments on six workloads show that the proposed caching boosts state-of-the-art HE schemes by orders of magnitude. A toy sketch of the caching idea follows this abstract.

    The third part leverages the security benefits of blockchains to protect the fairness and reliability of tokenized incentive mechanisms for FL systems. We designed a blockchain-based auditing protocol to mitigate Gaussian attacks and validated its effectiveness in experiments with multiple FL aggregation algorithms, popular data sets, and a variety of scales.
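    The toy Python sketch below illustrates one way the radix-caching idea could work: encryptions of radix powers are precomputed once, in parallel, and ciphertexts for arbitrary values are then assembled from their digits. Here enc, he_add, and he_scalar_mul are hypothetical stand-ins for an additively homomorphic library's encrypt/add/scalar-multiply calls, and a real scheme would also re-randomize the assembled ciphertext; the dissertation's actual algorithm is not specified in the abstract.

        from concurrent.futures import ThreadPoolExecutor

        RADIX, NUM_DIGITS = 16, 8

        def build_cache(enc):
            # Encrypt each radix power in parallel; done once, reused thereafter.
            with ThreadPoolExecutor() as pool:
                return list(pool.map(enc, (RADIX ** k for k in range(NUM_DIGITS))))

        def encrypt_via_cache(x, cache, he_add, he_scalar_mul):
            # Enc(x) = sum_k digit_k * Enc(RADIX**k): only cheap homomorphic
            # operations, no fresh (expensive) encryption of x itself.
            acc = None
            for k, c in enumerate(cache):
                digit = (x // RADIX ** k) % RADIX
                if digit:
                    term = he_scalar_mul(c, digit)
                    acc = term if acc is None else he_add(acc, term)
            return acc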

    Fusing simulated GEDI, ICESat-2 and NISAR data for regional aboveground biomass mapping

    Accurate mapping of forest aboveground biomass (AGB) is critical for better understanding the role of forests in the global carbon cycle. NASA's current GEDI and ICESat-2 missions, as well as the upcoming NISAR mission, will collect synergistic data with different coverage and sensitivity to AGB. In this study, we present a multi-sensor data fusion approach that leverages the strengths of each mission to produce wall-to-wall AGB maps that are more accurate and spatially comprehensive than what is achievable with any one sensor alone. Specifically, we calibrate a regional L-band radar AGB model using sparse, simulated spaceborne lidar AGB estimates. We assess our data fusion framework using simulations of GEDI, ICESat-2, and NISAR data derived from airborne laser scanning (ALS) and UAVSAR data acquired over the temperate, high-AGB forest and complex terrain of Sonoma County, California, USA. For the ICESat-2 and GEDI missions, we simulate two years of data coverage, and AGB at the footprint level is estimated using realistic AGB models. We compare the performance of our fusion framework when different combinations of the sparse simulated GEDI and ICESat-2 AGB estimates are used to calibrate our regional L-band AGB models. In addition, we test our framework at Sonoma using (a) 1-ha square grid cells and (b) similarly sized, irregularly shaped objects. We demonstrate that mean AGB across Sonoma is estimated more accurately with our fusion framework than with GEDI or ICESat-2 mission data alone, whether regular grid cells or irregular segments are used as mapping units. This research highlights methodological opportunities for fusing new and upcoming active remote sensing data streams toward improved AGB mapping.
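    A hedged Python sketch of the calibration step described above: sparse lidar footprint AGB estimates serve as training targets for a radar model that is then applied wall-to-wall. The log-log model form, variable names, and the use of scikit-learn are assumptions for illustration, not the study's actual model.

        import numpy as np
        from sklearn.linear_model import LinearRegression

        def calibrate_radar_agb(radar_backscatter, lidar_agb):
            # Fit AGB ~ L-band backscatter at the lidar footprint locations.
            X = np.log(radar_backscatter).reshape(-1, 1)  # e.g. NISAR-like gamma0
            y = np.log(lidar_agb)                         # footprint AGB, Mg/ha
            return LinearRegression().fit(X, y)

        def map_agb(model, backscatter_map):
            # Apply the calibrated model to every cell for a wall-to-wall map.
            X = np.log(backscatter_map).reshape(-1, 1)
            return np.exp(model.predict(X)).reshape(backscatter_map.shape)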