5,850 research outputs found

    A review on different techniques used to combat the non-IID and heterogeneous nature of data in FL

    Federated Learning (FL) is a machine-learning approach enabling collaborative model training across multiple decentralized edge devices that hold local data samples, all without exchanging these samples. This collaborative process occurs under the supervision of a central server orchestrating the training, or via a peer-to-peer network. The significance of FL is particularly pronounced in industries such as healthcare and finance, where data privacy holds paramount importance. However, training a model in the federated learning setting brings forth several challenges, one of the most prominent being the heterogeneity of data distribution among the edge devices. The data is typically non-independently and non-identically distributed (non-IID), thereby presenting challenges to model convergence. This report delves into the issues arising from non-IID and heterogeneous data and explores current algorithms designed to address these challenges.
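    The server-orchestrated training described above is commonly realized with Federated Averaging (FedAvg). The sketch below is a minimal illustration of one communication round under assumed conditions (a linear model, NumPy weight vectors, and a hypothetical `local_train` routine); it is not taken from the review itself and the client data, learning rate, and weighting are illustrative assumptions.

```python
import numpy as np

# Minimal FedAvg round: the server broadcasts the global weights, each client
# trains locally on its own (possibly non-IID) data, and the server averages
# the returned weights, weighted by local dataset size.
# `clients` is a hypothetical list of (X, y) arrays held on edge devices.

def local_train(weights, X, y, lr=0.1, epochs=1):
    """A few epochs of local gradient descent on a linear model (illustrative only)."""
    w = weights.copy()
    for _ in range(epochs):
        preds = X @ w
        grad = X.T @ (preds - y) / len(y)
        w -= lr * grad
    return w

def fedavg_round(global_w, clients):
    updates, sizes = [], []
    for X, y in clients:
        updates.append(local_train(global_w, X, y))
        sizes.append(len(y))
    sizes = np.array(sizes, dtype=float)
    # Weighted average of the client models; with non-IID clients this
    # averaging step is the main source of convergence difficulty.
    return np.average(np.stack(updates), axis=0, weights=sizes / sizes.sum())

# Example: three clients whose data come from different distributions (non-IID).
rng = np.random.default_rng(0)
clients = [(rng.normal(m, 1, (50, 3)), rng.normal(m, 1, 50)) for m in (0, 2, 5)]
w = np.zeros(3)
for _ in range(10):
    w = fedavg_round(w, clients)
```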

    Preprocessing Techniques in Character Recognition


    Study Of Gaussian & Impulsive Noise Suppression Schemes In Images

    Noise is usually introduced into images while acquiring and transferring them. The main type of noise added during image acquisition is Gaussian noise, while impulsive noise is generally introduced while transmitting image data over an insecure communication channel, though it can also arise during acquisition. Gaussian noise is a set of values drawn from a zero-mean Gaussian distribution and added to each pixel value. Impulsive noise replaces a fraction of the pixel values with random ones. Various techniques are employed for the removal of these types of noise, based on the properties of their respective noise models. Impulse noise removal algorithms popularly use order-statistics-based filters. The first method implemented is an adaptive filter using the center-weighted median: the difference between the center-weighted median of a neighborhood and the central pixel under consideration is compared with a set of thresholds. Another method, which takes into account the presence of noise-free pixels, has also been implemented. It convolves the median of each neighborhood with a set of convolution kernels oriented according to all possible configurations of edges that could contain the central pixel, in case it lies on an edge. A third method deals with the detection of noisy pixels on the binary slices of an image and is based on threshold Boolean filtering: the filter inverts the value of the central pixel if the number of pixels with values opposite to it exceeds a threshold. The fourth method uses an efficient double-derivative detector, which makes a decision based on the value of the double derivative; the substitution is done with the average grayscale value of the neighborhood. Gaussian noise removal algorithms should ideally smooth the distinct parts of the image without blurring the edges. A universal noise removal scheme is implemented which weighs each pixel with respect to its neighborhood and treats Gaussian and impulse noise pixels differently, based on parameter values for the spatial, radiometric, and impulsive weight of the central pixel. The aforementioned techniques are implemented and their results are compared both subjectively and objectively.
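    As an illustration of the first impulse filter mentioned above, the sketch below applies a center-weighted median with an increasing center weight and flags a pixel as impulsive when the difference between the filter output and the pixel exceeds the corresponding threshold, then replaces detected pixels with the plain median. The 3x3 window, the weight sequence, and the thresholds are assumptions chosen for illustration, not the parameters used in the study.

```python
import numpy as np

def center_weighted_median(window, center_weight):
    """Median of the window with the central pixel repeated `center_weight` times."""
    center = window[window.shape[0] // 2, window.shape[1] // 2]
    values = np.concatenate([window.ravel(), np.full(center_weight - 1, center)])
    return np.median(values)

def acwm_denoise(img, weights=(1, 3, 5, 7), thresholds=(40.0, 25.0, 10.0, 5.0)):
    """Adaptive center-weighted-median impulse filter on a 2-D grayscale image.

    A pixel is declared impulsive if, for any center weight, the absolute
    difference between the center-weighted median and the pixel exceeds the
    matching threshold; detected pixels are replaced by the plain median of
    their 3x3 neighborhood (illustrative parameter values).
    """
    out = img.astype(float).copy()
    padded = np.pad(img.astype(float), 1, mode="edge")
    h, w = img.shape
    for i in range(h):
        for j in range(w):
            window = padded[i:i + 3, j:j + 3]
            center = float(img[i, j])
            noisy = any(
                abs(center_weighted_median(window, cw) - center) > t
                for cw, t in zip(weights, thresholds)
            )
            if noisy:
                out[i, j] = np.median(window)
    return out
```

    Larger center weights make the filter behave more like the identity, so pairing them with smaller thresholds, as in the defaults above, lets the detector catch progressively weaker impulses while leaving noise-free detail untouched.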