Blood image analysis to detect malaria using filtering image edges and classification
Malaria is one of the most dangerous mosquito-borne diseases; its infection spreads through the bite of an infected mosquito, and it especially affects pregnant women and children under five years of age. Malarial species commonly occur in five different forms; therefore, to combat this life-threatening disease, contemporary researchers have proposed image-analysis-based solutions. In this work, we propose a diagnosis algorithm for malaria, implemented for testing and evaluation in Matlab. We use filtering and classification, with a median filter and an SVM classifier. Our proposed method identifies infected cells among the rest of the blood images. Median-filter smoothing is used to remove noise, and feature vectors are proposed to detect abnormalities in blood cells; the features include form factor, roundness measurement, shape, and counts of the total number of red cells and parasites. The primary aim of this research is to diagnose malaria by identifying infected cells. Many image-processing techniques and algorithms have been applied in this field, but their accuracy has been limited. Our proposed algorithm achieves more efficient results and higher accuracy than the NCC and fuzzy classifiers used in recent work.
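The roundness feature listed in the abstract is conventionally computed as the form factor 4πA/P²; a minimal sketch under that assumption (the paper's exact feature definitions are not given, and the function name is ours):

```python
import math

def form_factor(area: float, perimeter: float) -> float:
    """Roundness feature: 1.0 for a perfect circle, smaller for
    irregular shapes such as parasite-distorted cells."""
    return 4.0 * math.pi * area / perimeter ** 2

# A circle of radius r has area pi*r^2 and perimeter 2*pi*r,
# so its form factor is exactly 1.0 regardless of r.
print(round(form_factor(math.pi * 2.0 ** 2, 2 * math.pi * 2.0), 6))  # 1.0
```

A deformed (infected) cell region would yield a value noticeably below 1.0, which is what makes this feature useful for the SVM classifier.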
ANTHROPOMETRIC MEASUREMENT OF PRIMARY SCHOOL GOING CHILDREN IN KARACHI
Background: Children are a major part of our society and face many health issues, so it is necessary that children be healthy and that their health be assessed. Worldwide, the accepted method for assessing health status is anthropometry (weight and height). This study was done to assess the anthropometric measurements of primary-school-going children in Karachi, Sindh, Pakistan, and to evaluate how many students were underweight.
Methods: The study was conducted in different primary schools of Karachi, including private, semi-private, and government schools. The study design was cross-sectional. The sample size was 240, including both male and female children aged 7-12 years.
Result: The results show that 34.9% of children were underweight (below the 5th percentile), 63.44% were of normal weight (between the 5th and 95th percentiles), and only 0.8% were overweight (above the 95th percentile). The sample comprised 51.5% boys and 48.5% girls. The prevalence of underweight in the private sector was only 9.5%, but in government schools it was 45.5%. Height was also measured: 26.3% of children were below the 5th percentile of height-for-age, 62.8% were between the 5th and 95th percentiles, and 10.9% were above the 95th percentile.
Conclusion: The prevalence of underweight among primary-school-going children in Karachi shows that children's dietary requirements are not being met properly, which may lead to many severe pathological conditions; it is therefore necessary to take positive steps to raise awareness of proper diet, hygiene, and child growth and development.
Keywords: Underweight, Karachi, Anthropometric, School, Height
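The percentile cut-offs used in the results can be expressed as a small classification helper (cut-off values are taken from the abstract; the function name is ours):

```python
def weight_category(percentile: float) -> str:
    """Classify a child's weight-for-age percentile using the
    study's cut-offs: <5th underweight, 5th-95th normal, >95th overweight."""
    if percentile < 5:
        return "underweight"
    if percentile <= 95:
        return "normal"
    return "overweight"

print(weight_category(3))   # underweight
print(weight_category(50))  # normal
print(weight_category(97))  # overweight
```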
ROLE OF EXERCISES AND DIETARY INTAKE AS AN INTERVENTION IN WEIGHT REDUCTION
Background: Obesity is now so common in the world's population that it is beginning to replace undernutrition and infectious disease as the most significant contributor to ill health. Exercise plays a significant role alongside dietary control in weight reduction. The aim of our study is to determine the efficacy of weight-reduction interventions and the role of exercise.
Methods: The study was conducted on 60 participants, who were randomly divided into two groups. It was an experimental study completed over a duration of about six months. After giving consent, the sixty participants were assigned to group A (exercise plus diet) or group B (diet only).
Result: Of the sixty, only fifty-eight participants completed the study and were observed for four months. A comparable but relatively smaller weight loss was found in the diet-only group: body weight fell from 80.1±2.7 to 78.1±2.5 and BMI from 32±2 to 30±2, a significant change but less than in group A, whose body weight fell from 78.6±2.6 to 64.0±2.3 and whose BMI changed from 33±1 to 39±1. Therefore, the results show that group A had more significant outcomes than the other group included in the study.
Conclusion: The results show that the exercise-plus-diet group achieved more significant outcomes than the diet-only group included in the study.
Keywords: Anne Collins, weight reduction, exercise plus diet, diet, BMI, body weight
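The reported BMI figures follow the standard definition BMI = weight / height²; a small sketch (the 1.58 m height below is an assumed example for illustration, not a value from the study):

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body Mass Index: weight in kilograms over height in metres squared."""
    return weight_kg / height_m ** 2

# An assumed height of 1.58 m with the group-B starting weight of 80.1 kg
# gives a BMI close to the reported baseline of 32.
print(round(bmi(80.1, 1.58), 1))  # 32.1
```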
Text-Independent Speaker Verification Based on Information Theoretic Learning
In this paper, VQ (Vector Quantization) based on information-theoretic learning is investigated for the task of text-independent speaker verification. A novel VQ method based on IT (Information Theoretic) principles is used for speaker verification and compared with two classical VQ approaches: the K-means algorithm and the LBG (Linde-Buzo-Gray) algorithm. The paper provides a theoretical background of the vector quantization techniques, followed by experimental results illustrating their performance. The results demonstrate that ITVQ (Information Theoretic Vector Quantization) provides the best performance in terms of classification rates, EER (Equal Error Rate), and MSE (Mean Squared Error) compared to the K-means and LBG algorithms. The outstanding performance of the ITVQ algorithm can be attributed to the fact that the IT criteria it uses provide superior matching between the distribution of the original data vectors and the codewords.
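Of the three quantizers compared, K-means is the simplest to sketch; a minimal scalar-codebook version of the alternating assign/update loop (illustrative only, not the paper's implementation, which operates on feature vectors):

```python
def kmeans_codebook(data, codebook, iterations=20):
    """Classical K-means VQ: assign each sample to its nearest codeword,
    then move each codeword to its cluster centroid; repeat."""
    for _ in range(iterations):
        clusters = [[] for _ in codebook]
        for x in data:
            nearest = min(range(len(codebook)),
                          key=lambda i: (x - codebook[i]) ** 2)
            clusters[nearest].append(x)
        codebook = [sum(c) / len(c) if c else codebook[i]
                    for i, c in enumerate(clusters)]
    return codebook

# Two well-separated clusters of scalar "features"
data = [0.5, 1.0, 1.5, 4.5, 5.0, 5.5]
print(kmeans_codebook(data, [0.0, 6.0]))  # [1.0, 5.0]
```

ITVQ replaces the squared-error criterion above with information-theoretic divergence measures between the data and codeword distributions, which is what the abstract credits for its superior matching.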
Clustering and Fault Tolerance for Target Tracking using Wireless Sensor Networks
Over the last few years, the deployment of WSNs (Wireless Sensor Networks) has been fostered in diverse applications. WSNs have great potential in a variety of domains ranging from scientific experiments to commercial applications. Because WSNs are deployed in dynamic and unpredictable environments, they must be able to cope with a variety of faults. This paper proposes an energy-aware fault-tolerant clustering protocol for target-tracking applications, termed the FTTT (Fault Tolerant Target Tracking) protocol. The identification of RNs (Redundant Nodes) makes SN (Sensor Node) fault tolerance feasible, and the clustering supports recovery of sensors supervised by a faulty CH (Cluster Head). The FTTT protocol reduces energy consumption in two steps: first, by identifying RNs in the network; secondly, by restricting the number of SNs sending data to the CH. Simulations validate the scalability and low power consumption of the FTTT protocol in comparison with the LEACH protocol.
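The abstract does not spell out the redundancy criterion, so the idea of identifying RNs can only be illustrated with a deliberately simplified overlap rule (the rule and threshold below are our assumptions, not the FTTT definition):

```python
import math

def redundant_nodes(nodes, sensing_range):
    """Toy redundancy rule: a sensor is marked redundant if a non-redundant
    sensor lies so close that their sensing disks largely overlap.
    Illustrative only; FTTT's actual criterion is more involved."""
    redundant = set()
    for i, (xi, yi) in enumerate(nodes):
        for j, (xj, yj) in enumerate(nodes):
            if j < i and j not in redundant:
                if math.hypot(xi - xj, yi - yj) < sensing_range / 2:
                    redundant.add(i)
                    break
    return redundant

nodes = [(0, 0), (0.4, 0), (10, 10)]  # node 1 sits almost on top of node 0
print(redundant_nodes(nodes, 2.0))    # {1}
```

Nodes flagged this way could sleep to save energy, which is the first of the two energy-reduction steps the protocol describes.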
Using Reversed MFCC and IT-EM for Automatic Speaker Verification
This paper proposes a text-independent automatic speaker verification system using IMFCC (Inverted Mel Frequency Cepstral Coefficients) and IT-EM (Information Theoretic Expectation Maximization). For speaker verification, feature extraction based on the Mel scale has been widely applied and has established good results. The IMFCC is based on the inverse Mel scale and effectively captures information available in the high-frequency formants that is ignored by the MFCC. In this paper, fusion of MFCC and IMFCC at the input level is proposed. GMMs (Gaussian Mixture Models) trained with EM (Expectation Maximization) have been widely used for classification in text-independent verification; however, EM suffers from convergence issues. Here we use our proposed IT-EM, which converges faster, to train the speaker models. IT-EM uses information-theoretic principles such as PDE (Parzen Density Estimation) and the KL (Kullback-Leibler) divergence measure. Like EM, IT-EM adapts the weights, means, and covariances; however, the IT-EM process operates not on the feature-vector sets but on a set of centroids obtained using an IT (Information Theoretic) metric. The IT-EM process simultaneously minimizes the divergence measure between the PDE estimate of the feature distribution within a given class and the centroid distribution within the same class. The feature-level fusion and IT-EM are tested on the speaker verification task using the NIST2001 and NIST2004 corpora. The experimental evaluation validates that MFCC/IMFCC gives better results than the conventional delta/MFCC feature set; the MFCC/IMFCC feature-vector size is also much smaller than that of delta MFCC, reducing the computational burden as well. The IT-EM method also showed faster convergence than the conventional EM method, leading to higher speaker recognition scores.
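The PDE ingredient of IT-EM, Parzen density estimation with a Gaussian kernel, can be sketched in the scalar case (the bandwidth h and the one-dimensional setting are illustrative choices; the paper works with feature vectors):

```python
import math

def parzen_density(x, samples, h=1.0):
    """Parzen (kernel) density estimate at x: average of Gaussian kernels
    of bandwidth h centred on the observed samples."""
    norm = 1.0 / (len(samples) * h * math.sqrt(2 * math.pi))
    return norm * sum(math.exp(-0.5 * ((x - s) / h) ** 2) for s in samples)

# With a single sample at 0, the estimate at 0 is the Gaussian peak 1/sqrt(2*pi).
print(round(parzen_density(0.0, [0.0]), 4))  # 0.3989
```

IT-EM's divergence measure compares such a smoothed estimate of the class feature distribution against the distribution of the class centroids.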
Distance Measurement Error Reduction Analysis for the Indoor Positioning System
The introduction is followed by a section on the wireless communication channel model, which describes the model, its impairments, and the TOA/TDOA channel profile. Section 3 discusses the deployment of the indoor positioning system and the experimental details, Section 4 presents the results obtained, and the paper is concluded in Section 5. The wireless channel can be modelled simply in the frequency domain through the phase factor β = ω/c, where c is the free-space speed of light. The received signal arrives with a time delay τ that depends on the range r, related by τ = r/c, and the electric field can be written accordingly. In a typical wireless communication system the signal is received over multiple paths; a typical wireless channel comprises such paths, and its CIR (Channel Impulse Response) can be represented by a tapped delay-line model.
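The TOA relation τ = r/c that underlies the distance measurement can be sketched directly (function names are ours):

```python
C = 299_792_458.0  # free-space speed of light, m/s

def delay_from_range(r_m: float) -> float:
    """TOA model: propagation delay tau = r / c."""
    return r_m / C

def range_from_delay(tau_s: float) -> float:
    """Inverse relation used to turn a measured delay into a distance."""
    return tau_s * C

# A 30 m indoor range corresponds to roughly a 100 ns one-way delay,
# which is why indoor positioning needs nanosecond-scale delay accuracy.
print(round(delay_from_range(30.0) * 1e9, 1))  # 100.1 ns
```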
An Efficient Channel Model for OFDM and Time Domain Single Carrier Transmission Using Impulse Responses
OFDM (Orthogonal Frequency Division Multiplexing) is the best-known and most widely used wideband communication technique of the current era. SCT (Single Carrier Transmission) provides equivalent performance in the time domain while the decision equalizer is implemented in the frequency domain. SCT eliminates the ICI (Inter Carrier Interference) and PAPR (Peak to Average Power Ratio) problems that are inherent to OFDM and degrade its performance in time-varying channels. An efficient channel model is presented in this contribution to implement OFDM and SCT in the time domain using impulse responses. Both the OFDM and SCT models are derived analytically to model the channel impulse responses. Our model enhances the performance of time-domain SCT compared with OFDM and subsides the PAPR and ICI problems of OFDM. SCT is implemented at the symbol level within blocks. Simulation results implementing DRM (Digital Radio Mondiale) assert the performance gain of SCT over OFDM.
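A tapped impulse response acting on a transmitted block (after cyclic-prefix removal) reduces to a circular convolution, the time-domain operation common to both the OFDM and block-wise SCT links; a minimal sketch (signal values are illustrative):

```python
def channel_output(x, h):
    """Apply channel impulse response h to block x as a circular
    convolution: y[i] = sum_k h[k] * x[(i - k) mod N]."""
    n = len(x)
    return [sum(h[k] * x[(i - k) % n] for k in range(len(h)))
            for i in range(n)]

x = [1, 0, 0, 0]   # unit-impulse block exposes the channel taps
h = [1.0, 0.5]     # two-tap impulse response
print(channel_output(x, h))  # [1.0, 0.5, 0.0, 0.0]
```

Feeding a unit impulse through the model returns the taps themselves, a quick sanity check that the block-level channel is behaving as an impulse-response filter.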
A Novel Method to Implement the Matrix Pencil Super Resolution Algorithm for Indoor Positioning
This article presents the results of algorithms implemented to estimate the delays and distances for an indoor positioning system. The data sets for the transmitted and received signals were captured in a typical outdoor and indoor area, and super-resolution estimation algorithms were applied. Different state-of-the-art and super-resolution algorithms are applied to obtain optimal estimates of the delays and distances between the transmitted and received signals, and a novel method for the Matrix Pencil algorithm is devised. The algorithms perform variably in different scenarios of transmitter and receiver positions. Two scenarios were examined: in the single-antenna scenario, super-resolution techniques such as ESPRIT (Estimation of Signal Parameters via Rotational Invariance Techniques) and the Matrix Pencil algorithm give optimal performance compared to the conventional techniques. In the two-antenna scenario, Root-MUSIC and the Matrix Pencil algorithm performed better than the other algorithms for distance estimation; however, the accuracy of all the algorithms is worse than in the single-antenna scenario. In all cases, our devised Matrix Pencil algorithm achieved the best estimation results.
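The conventional techniques that the super-resolution methods are compared against typically estimate the delay from the peak of a cross-correlation between transmitted and received signals; a minimal sketch (the signals and the integer-lag resolution are illustrative assumptions, and this is the baseline, not the Matrix Pencil method itself):

```python
def crosscorr_delay(tx, rx):
    """Conventional delay estimate: the lag that maximizes the
    cross-correlation between the transmitted and received blocks."""
    best_lag, best_val = 0, float("-inf")
    for lag in range(len(rx) - len(tx) + 1):
        val = sum(t * rx[lag + i] for i, t in enumerate(tx))
        if val > best_val:
            best_lag, best_val = lag, val
    return best_lag

tx = [1.0, -1.0, 1.0]
rx = [0.0, 0.0, 0.0, 1.0, -1.0, 1.0, 0.0]  # tx delayed by 3 samples
print(crosscorr_delay(tx, rx))  # 3
```

Its resolution is limited to the sampling interval, which is exactly the limitation that super-resolution methods such as ESPRIT, Root-MUSIC, and Matrix Pencil are designed to overcome.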