775 research outputs found

    A Suboptimal Approach to Antenna Design Problem With Kernel Regression

    This paper proposes a novel iterative algorithm based on kernel regression as a suboptimal approach to reliable and efficient antenna optimization. In this approach, the complex, non-linear cost surface computed from the antenna characteristics is fitted with a simple model that is linear in the kernels, and the argument that minimizes this kernel regression model becomes the next input whose cost is evaluated by numerical simulation. The process repeats, updating the coefficients of the kernel regression model with each new entry, until the stopping criteria are met. At every iteration, the existing inputs are partitioned into a limited number of clusters to reduce computational time and resources and to prevent unexpected over-weighting. The proposed approach is validated on the Rastrigin function as well as a real engineering problem, an antipodal Vivaldi antenna, in comparison with a genetic algorithm. Furthermore, we explore which kernel minimizes the least-squares error when fitting the antenna cost surface. The results demonstrate that the proposed process is a reliable approach for antenna design problems with fast convergence.
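
    The loop described above is straightforward to prototype. The minimal Python sketch below uses scikit-learn's KernelRidge as the kernel regression model on the 2-D Rastrigin benchmark; the RBF kernel, the random-candidate surrogate minimizer, and the k-means rule for capping the retained inputs are illustrative assumptions, not the paper's exact choices.

```python
# Sketch of the kernel-regression surrogate loop on the 2-D Rastrigin function.
import numpy as np
from sklearn.kernel_ridge import KernelRidge
from sklearn.cluster import KMeans

def rastrigin(x):
    # Classic multimodal test function; global minimum 0 at the origin.
    return 10 * x.shape[-1] + np.sum(x**2 - 10 * np.cos(2 * np.pi * x), axis=-1)

rng = np.random.default_rng(0)
X = rng.uniform(-5.12, 5.12, size=(20, 2))       # initial sampled inputs
y = rastrigin(X)                                 # stand-in for the simulator

for _ in range(30):
    if len(X) > 40:
        # Partition inputs into clusters and keep the best point per cluster,
        # bounding cost and avoiding over-weighting of dense regions.
        labels = KMeans(n_clusters=40, n_init=5, random_state=0).fit_predict(X)
        keep = [np.flatnonzero(labels == k)[np.argmin(y[labels == k])]
                for k in range(40)]
        X, y = X[keep], y[keep]
    model = KernelRidge(kernel="rbf", gamma=0.5, alpha=1e-3).fit(X, y)
    cand = rng.uniform(-5.12, 5.12, size=(2000, 2))
    x_new = cand[np.argmin(model.predict(cand))]  # argmin of the surrogate
    X = np.vstack([X, x_new])
    y = np.append(y, rastrigin(x_new))            # evaluate the true cost

print("best cost found:", y.min())
```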

    Thirty Years of Machine Learning: The Road to Pareto-Optimal Wireless Networks

    Future wireless networks have substantial potential to support a broad range of complex and compelling applications in both military and civilian fields, where users can enjoy high-rate, low-latency, low-cost and reliable information services. Achieving this ambitious goal requires new radio techniques for adaptive learning and intelligent decision making because of the complex, heterogeneous nature of the network structures and wireless services. Machine learning (ML) algorithms have achieved great success in supporting big data analytics, efficient parameter estimation and interactive decision making. Hence, in this article, we review the thirty-year history of ML by elaborating on supervised learning, unsupervised learning, reinforcement learning and deep learning. Furthermore, we investigate their employment in compelling applications of wireless networks, including heterogeneous networks (HetNets), cognitive radio (CR), the Internet of Things (IoT), machine-to-machine (M2M) networks, and so on. This article aims to assist readers in clarifying the motivation and methodology of the various ML algorithms, so as to invoke them for hitherto unexplored services and scenarios in future wireless networks.

    Approximate Kernel Orthogonalization for Antenna Array Processing

    We present a method for kernel antenna array processing using Gaussian kernels as basis functions. The method first identifies the data clusters using a modified sparse greedy matrix approximation; the algorithm then performs model reduction to shrink the final size of the beamformer. The method is tested in simulations that include two arrays made of two and seven printed half-wavelength thick dipoles, in scenarios with 4 and 5 users arriving from different angles. The antenna parameters are simulated for all directions of arrival (DOAs) and include the dipole radiation pattern and the mutual coupling effects of the array. The method is compared with other state-of-the-art nonlinear processing methods, showing that the presented algorithm offers near-optimal performance together with a low computational burden. (Work supported by the Spanish Government under Grant TEC2008-02473.)
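
    As a rough illustration of the greedy-basis idea, the Python sketch below trains a Gaussian-kernel beamformer on simulated two-element array snapshots, growing the kernel basis one worst-fit snapshot at a time. The ideal isotropic elements (no radiation pattern or mutual coupling), the residual-based greedy criterion, and the basis size are simplifying assumptions rather than the paper's algorithm.

```python
# Gaussian-kernel beamformer with a greedily grown, reduced-size basis.
import numpy as np

rng = np.random.default_rng(1)
n_elem, n_snap = 2, 400
doas = np.deg2rad([-30, 10, 45, 70])                 # 4 users at distinct DOAs
steer = np.exp(1j * np.pi * np.outer(np.arange(n_elem), np.sin(doas)))
sym = rng.choice([-1.0, 1.0], size=(len(doas), n_snap))   # BPSK streams
snaps = steer @ sym + 0.1 * (rng.standard_normal((n_elem, n_snap))
                             + 1j * rng.standard_normal((n_elem, n_snap)))
X = np.vstack([snaps.real, snaps.imag]).T            # real features per snapshot
d = sym[0]                                           # desired user's symbols

def gauss(A, B, sigma=2.0):
    # Gaussian kernel matrix between the row vectors of A and B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma**2))

basis = [0]                                          # indices of kernel centres
for _ in range(15):                                  # small basis = reduced model
    K = gauss(X, X[basis])
    w, *_ = np.linalg.lstsq(K, d, rcond=None)        # regression weights
    resid = np.abs(K @ w - d)
    resid[basis] = 0                                 # never re-pick a centre
    basis.append(int(np.argmax(resid)))              # add the worst-fit snapshot

K = gauss(X, X[basis])
w, *_ = np.linalg.lstsq(K, d, rcond=None)
print("training BER:", np.mean(np.sign(K @ w) != d))
```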

    Digital communication receivers using Gaussian processes for machine learning

    We propose Gaussian processes (GPs) as a novel nonlinear receiver for digital communication systems. The GP framework can be used to solve both classification (GPC) and regression (GPR) problems. The minimum mean squared error (MMSE) solution is the expectation of the transmitted symbol given the information at the receiver, which is a nonlinear function of the received symbols for discrete inputs. GPR can therefore be viewed as a nonlinear MMSE estimator, capable of achieving optimal performance from the MMSE viewpoint. The design of digital communication receivers can also be viewed as a detection problem, for which GPC is especially suited, as it assigns posterior probabilities to each transmitted symbol. We explore the suitability of GPs as nonlinear digital communication receivers. GPs are Bayesian machine learning tools that formulate a likelihood function for their hyperparameters, which can then be set optimally; they thereby outperform state-of-the-art nonlinear machine learning approaches that prespecify their hyperparameters or rely on cross-validation. We illustrate the advantages of GPs as digital communication receivers for linear and nonlinear channel models with short training sequences, comparing them with state-of-the-art nonlinear machine learning tools such as support vector machines.
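
    A minimal sketch of a GPR receiver in this spirit is shown below, using scikit-learn's GaussianProcessRegressor, which sets the kernel hyperparameters by maximizing the marginal likelihood, as the abstract describes. The tanh channel nonlinearity, the three-tap channel, and the window length are assumptions chosen only for illustration.

```python
# GPR as a nonlinear receiver: regress the BPSK symbol on a received window.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(2)
s = rng.choice([-1.0, 1.0], size=600)               # transmitted BPSK symbols
h = np.array([0.9, 0.4, 0.2])                       # assumed channel taps
r = np.tanh(np.convolve(s, h)[: len(s)])            # nonlinear channel output
r += 0.05 * rng.standard_normal(len(s))             # additive noise

L = 3                                               # receiver window length
Xw = np.array([r[i:i + L] for i in range(len(s) - L)])
yw = s[: len(Xw)]

n_train = 100                                       # short training sequence
gpr = GaussianProcessRegressor(kernel=RBF(1.0) + WhiteKernel(0.1),
                               normalize_y=True)
gpr.fit(Xw[:n_train], yw[:n_train])                 # ML-II hyperparameter fit
pred = np.sign(gpr.predict(Xw[n_train:]))           # hard symbol decisions
print("test BER:", np.mean(pred != yw[n_train:]))
```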

    Machine Learning Applications in Spacecraft State and Environment Estimation

    Some problems in spacecraft systems engineering exhibit highly non-linear characteristics and noise for which traditional nonlinear estimation techniques fail to yield accurate results. In this thesis, we approach two such problems using kernel methods in machine learning. First, we present a novel formulation and solution to orbit determination of spacecraft and spacecraft groups that can be applied in very weakly observable and highly noisy scenarios, together with a ground station network architecture that performs orbit determination using Doppler-only observations over the network. Second, we present a machine learning solution to the spacecraft magnetic field interference cancellation problem using distributed magnetometers, paving the way for space magnetometry with boom-less CubeSats.

    We present an approach to orbit determination under very broad conditions that are satisfied for n-body problems. We show that domain generalization and distribution regression techniques can learn to estimate the orbits of a group of satellites and identify individual satellites, especially with prior understanding of the correlations between orbits, and we provide asymptotic convergence conditions. The approach requires only observability of the dynamical system and visibility of the spacecraft, and it is particularly useful for autonomous spacecraft operations using low-cost ground stations or sensors. With the absence of linear-region constraints in the proposed method, we are able to identify orbits that are 800 km apart and reduce orbit uncertainty by 92.5%, to under 60 km, with noisy Doppler-only measurements.

    We present an architecture for collaborative orbit determination using networked ground stations, focusing on clusters of satellites deployed in low Earth orbit and on measurements of their Doppler-shifted transmissions made by low-gain antenna systems in a software-defined federated ground station network. We develop a network architecture enabling scheduling and tracking with uncertain orbit information, along with scheduling and coordinated tracking algorithms for generating measurements for orbit determination. We validate our algorithms and architecture by applying them to high-fidelity simulations of different networked orbit determination scenarios, and we demonstrate how these low-cost ground stations can provide accurate and timely orbital tracking information for large satellite deployments, something that remains a challenge for current tracking systems.

    Last, we present a novel approach and algorithm for cancelling time-varying magnetic field interference using distributed magnetometers and spacecraft telemetry, with particular emphasis on the computational and power requirements of CubeSats. The spacecraft magnetic field interference cancellation problem involves estimating noise when the number of interfering sources far exceeds the number of sensors required to decouple the noise from the signal. The proposed approach models this as a contextual bandit learning problem, and the proposed algorithm learns to identify the optimal low-noise combination of distributed magnetometers based on indirect information about spacecraft currents gained through telemetry. Experimental results based on on-orbit spacecraft telemetry show a 50% reduction in interference compared to the best magnetometer.

    Ph.D. dissertation, Electrical Engineering: Systems, University of Michigan, Horace H. Rackham School of Graduate Studies. https://deepblue.lib.umich.edu/bitstream/2027.42/147688/1/srinag_1.pd
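
    To make the contextual-bandit formulation concrete, the sketch below casts discretized telemetry states as contexts, candidate magnetometer combinations as arms, and the negative residual interference power as the reward. The two-magnetometer model, the per-context interference patterns, and the epsilon-greedy value learner are illustrative assumptions, not the dissertation's algorithm.

```python
# Epsilon-greedy contextual bandit over magnetometer combinations.
import numpy as np

rng = np.random.default_rng(3)
combos = [np.array([1.0, 0.0]),    # use magnetometer A only
          np.array([0.0, 1.0]),    # use magnetometer B only
          np.array([0.5, 0.5])]    # average both readings
dirs = np.array([[1.0, -0.8],      # interference pattern per telemetry state
                 [0.3, 1.0],
                 [1.0, 0.2]])
n_ctx = len(dirs)
Q = np.zeros((n_ctx, len(combos)))  # value estimate per (context, arm)
N = np.zeros_like(Q)                # pull counts per (context, arm)

for t in range(3000):
    ctx = rng.integers(n_ctx)                        # observed telemetry state
    interf = dirs[ctx] * (1.0 + 0.1 * rng.standard_normal())
    noise = 0.05 * rng.standard_normal(2)
    eps = max(0.01, 1.0 / (1 + t / 100))             # decaying exploration
    arm = (rng.integers(len(combos)) if rng.random() < eps
           else int(np.argmax(Q[ctx])))
    reading = combos[arm] @ (interf + noise)         # combined field reading
    reward = -reading**2                             # low residual power is good
    N[ctx, arm] += 1
    Q[ctx, arm] += (reward - Q[ctx, arm]) / N[ctx, arm]  # running mean update

print("chosen combination per telemetry state:", np.argmax(Q, axis=1))
```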