134 research outputs found
Algorithms design for improving homecare using Electrocardiogram (ECG) signals and Internet of Things (IoT)
Due to rapid population growth, many hospitals are overcrowded with patient visits. Moreover, during COVID-19 many patients preferred to stay at home to minimize
the spread of the virus, so providing care to patients at home has become essential. The Internet
of Things (IoT) is widely known and used across many fields, and IoT-based homecare can help
reduce the burden on hospitals. Combining IoT with homecare brings several benefits, such as
reduced human effort, economic savings, and improved efficiency and effectiveness. One
of the most important requirements of a homecare system is accuracy, because such systems
deal with human health, which is sensitive and demands a high level of accuracy. Moreover,
these systems handle huge amounts of data due to continuous sensing, and this data must be
processed well to provide a fast diagnostic response at minimum cost.
The heart is one of the most important organs in the human body and requires a high level of care.
Monitoring heart status can diagnose disease at an early stage and help health experts find the
best medication plan. However, continuous monitoring and diagnosis of the heart can exhaust
caregivers, and an IoT heart monitoring model at home is a solution to this problem. Electrocardiogram
(ECG) signals track heart condition through their waves and peaks, and accurate,
efficient IoT ECG monitoring at home can detect heart diseases and save lives.
As a consequence, an IoT ECG homecare monitoring model is designed in this thesis for detecting
cardiac arrhythmia and diagnosing heart diseases. Two databases of ECG signals are used:
an online database, which is old and limited, and a large, unique database collected from real patients
in a hospital. The raw ECG signal of each patient is passed through the implemented signal
processing techniques, a low-pass filter and a Savitzky-Golay filter, to remove noise and any
external interference. The cleaned signal is then passed through a feature extraction stage,
which extracts a number of features based on metrics and medical information, along with a feature extraction algorithm that finds peaks and waves. These features are saved in a local database
for classification. For diagnosis, a classification stage is built using three
approaches, threshold values, machine learning and deep learning, in order to increase accuracy.
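The preprocessing and feature extraction pipeline described above can be sketched as follows. This is only an illustration of the general technique, not the thesis's implementation: the sampling rate, filter cutoff, Savitzky-Golay window, and peak-detection settings are assumed values, and the ECG trace is synthetic.

```python
# Sketch of the preprocessing stage: low-pass filtering, Savitzky-Golay
# smoothing, then R-peak detection as a simple feature extraction step.
# All parameter values here are illustrative assumptions.
import numpy as np
from scipy.signal import butter, filtfilt, savgol_filter, find_peaks

fs = 360                                  # sampling rate (Hz), assumed
t = np.arange(0, 10, 1 / fs)              # 10 seconds of signal

# Synthetic "ECG": one R-like spike per second plus additive noise.
ecg = np.zeros_like(t)
ecg[fs // 2 :: fs] = 1.0
rng = np.random.default_rng(0)
ecg += 0.02 * rng.standard_normal(t.size)

# 1) Low-pass filter to suppress high-frequency interference
#    (zero-phase filtering so peak positions are not shifted).
b, a = butter(4, 40 / (fs / 2), btype="low")
lowpassed = filtfilt(b, a, ecg)

# 2) Savitzky-Golay smoothing preserves peak shape better than a
#    plain moving average.
smooth = savgol_filter(lowpassed, window_length=15, polyorder=3)

# 3) Feature extraction: locate R-peaks and derive RR intervals.
peaks, _ = find_peaks(smooth, height=0.5 * smooth.max(),
                      distance=int(0.4 * fs))
rr_intervals = np.diff(peaks) / fs        # seconds between beats
heart_rate = 60.0 / rr_intervals.mean()   # beats per minute
```

The detected peak locations and RR intervals are the kind of per-beat features that a classification stage can then consume.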
The threshold-value classification technique works from medical reference values and borderline ranges:
if any feature falls outside these ranges, a warning message is raised with the expected
heart disease. The second type of classification uses machine learning to minimize
human effort. A Support Vector Machine (SVM) algorithm is proposed and run
on the features extracted from both databases; the classification accuracy for the online and hospital
databases was 91.67% and 94%, respectively. Due to the non-linearity of the decision boundary, a
third classification approach using deep learning is presented. A full Multilayer Perceptron (MLP)
neural network is implemented to improve the accuracy and reduce errors; the error
was reduced to 0.019 and 0.006 on the online and hospital databases, respectively.
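The SVM classification stage can be illustrated with a minimal sketch. The feature vectors below are synthetic stand-ins for the extracted ECG features (e.g. intervals and amplitudes); the accuracy figures quoted in the abstract come from the thesis's real databases, not from this toy data, and the RBF kernel choice is an assumption motivated by the non-linear decision boundary mentioned above.

```python
# Sketch of SVM-based arrhythmia classification on per-beat feature
# vectors. Data is synthetic; class means and scales are invented.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 400
# Two synthetic classes, e.g. (RR interval, QRS width, P amplitude),
# with shifted means so they are separable.
normal = rng.normal(loc=[0.8, 0.08, 0.15], scale=0.05, size=(n, 3))
arrhythmic = rng.normal(loc=[0.5, 0.12, 0.05], scale=0.05, size=(n, 3))
X = np.vstack([normal, arrhythmic])
y = np.array([0] * n + [1] * n)           # 0 = normal, 1 = arrhythmic

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25,
                                          random_state=0)

# RBF kernel handles a non-linear decision boundary.
clf = SVC(kernel="rbf", C=1.0, gamma="scale").fit(X_tr, y_tr)
accuracy = clf.score(X_te, y_te)
```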
Since the hospital database is huge, a technique is needed to reduce the amount
of data. Therefore, a novel adaptive amplitude threshold compression algorithm is proposed.
This algorithm can diagnose heart disease directly from the compressed
ECG signals, at reduced size, with high accuracy and low cost. The features extracted from the compressed
and original signals are similar, with only slight differences of 1%, 2% and 3%, with no effect on machine
learning or deep learning classification accuracy and no need for any reconstruction.
With data compression, throughput is improved by 43% and storage space is reduced by 57%.
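The abstract does not specify the details of the adaptive amplitude threshold algorithm, so the following is only a generic illustration of the idea: keep the samples whose deviation from the baseline exceeds a threshold derived from the signal itself, and store them as (index, value) pairs. The threshold rule and all values here are assumptions.

```python
# Generic amplitude-threshold compression sketch (not the thesis's
# algorithm): retain only samples far from the baseline, where the
# threshold adapts to the signal's own statistics.
import numpy as np

def amplitude_threshold_compress(signal, k=1.5):
    """Keep samples deviating from the median by more than k standard
    deviations; return their indices and values."""
    baseline = np.median(signal)
    threshold = k * np.std(signal)        # adaptive: derived from the data
    keep = np.abs(signal - baseline) > threshold
    return np.flatnonzero(keep), signal[keep]

# Synthetic ECG-like trace: low-level noise with ten R-like peaks.
rng = np.random.default_rng(0)
ecg = 0.02 * rng.standard_normal(3600)
ecg[180::360] = 1.0

idx, vals = amplitude_threshold_compress(ecg)
storage_saved = 1 - idx.size / ecg.size   # fraction of samples dropped
```

The clinically interesting samples (the peaks) survive compression, which is consistent with the claim that features can be extracted from the compressed signal without reconstruction.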
Moreover, to achieve fast response, the amount of data should be reduced further to allow
fast data transmission. A compressive sensing based cardiac homecare system is therefore presented,
which lets the channel between sender and receiver carry a small amount of data.
Experimental results reveal that the proposed models are more accurate in the classification of
cardiac arrhythmia and in the diagnosis of heart diseases, while ensuring fast
diagnosis and minimum cost requirements. Based on the experiments on classification accuracy,
number of errors and false alarms, the dictionary size for compressive sensing was selected to be 900.
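The compressive sensing idea can be sketched briefly: a sparse signal of length N is measured through an M x N random sensing matrix with M much smaller than N, so only M values cross the channel, and the receiver recovers the sparse coefficients with a greedy solver. The sizes, sparsity and solver below are toy assumptions, not the thesis's 900-atom dictionary or its recovery method.

```python
# Compressive sensing sketch: transmit M << N measurements of a sparse
# signal and recover it with Orthogonal Matching Pursuit.
import numpy as np
from sklearn.linear_model import OrthogonalMatchingPursuit

rng = np.random.default_rng(1)
N, M, k = 200, 60, 5                  # signal length, measurements, sparsity

# A k-sparse signal: only k of N coefficients are nonzero.
x = np.zeros(N)
support = rng.choice(N, size=k, replace=False)
x[support] = rng.normal(size=k)

Phi = rng.normal(size=(M, N)) / np.sqrt(M)   # random sensing matrix
y = Phi @ x                                   # only M values are transmitted

# Receiver side: greedy sparse recovery from the M measurements.
omp = OrthogonalMatchingPursuit(n_nonzero_coefs=k,
                                fit_intercept=False).fit(Phi, y)
x_hat = omp.coef_
reconstruction_error = np.linalg.norm(x_hat - x) / np.linalg.norm(x)
```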
As a result, this thesis provides three different scenarios for IoT homecare cardiac
monitoring to assist further research in designing homecare cardiac monitoring systems. The experimental results reveal that these scenarios produce better results with a high level of accuracy
while minimizing data and cost requirements.
Understanding the usage of Mobile Payment Systems - the impact of personality on continuance usage
Payment convenience has benefited from the revolution in mobile technologies. M-payment users, however, seem inconsistent in their payment activity, resisting change from traditional payment methods. Ensuring consumer continuance of m-payment technology usage is critical to ensuring the ubiquity of m-payment solutions. Although research has examined the influence of individual differences on the acceptance of m-payment, most studies fail to consider whether ongoing acceptance is maintained by the user, or whether a change in perception occurs as a result of use. Moreover, current studies consider user demographic profiles to segment mobile users, yet this dismisses the impact of individual differences, e.g. personality or cognitive style. This paper proposes a model that can be used to investigate the impact of individual differences on user perception of m-payment systems. The Expectation Confirmation Model (ECM) and Unified Theory of Acceptance and Use of Technology (UTAUT2) model factors (i.e. effort expectancy, performance expectancy, social influence, facilitating conditions, habit, hedonic motivation, price value, trust and perceived risk) allow capture of data relating to two usage perceptions: pre-usage and post-usage perception. The proposed model allows capture and comparison of pre-usage expectations and post-usage beliefs, allowing consideration of perception variance as a result of technology use. This model will be applied to gain a deeper insight into how to address users' satisfaction, acceptance, and continuance usage of Near Field Communication m-payment technologies.
A novel Automatic Optic Disc and Cup Image Segmentation System for Diagnosing Glaucoma using RIGA dataset
The optic nerve head (ONH) of the retina is a very important landmark of the fundus and is altered in optic nerve pathology, especially glaucoma. Numerous imaging systems are available to capture the retinal fundus and to infer some structural parameters from it; the retinal fundus camera is one of the most important tools used for this purpose. Currently, the ONH structure in fundus images is examined by professionals only by observation, and there is a worldwide shortage of highly trained professionals. Reliable and efficient optic disc and cup localization and segmentation algorithms are therefore important for automatic eye disease screening and for monitoring the progression or remission of the disease. Thus, in order to develop such a system, a retinal fundus image dataset is necessary to train and test the new software systems.
The methods for diagnosing glaucoma are reviewed in the first chapter, and the various datasets of retinal fundus images that are currently publicly available are described and discussed. The second chapter reviews the techniques for optic disc and cup segmentation available in the literature. In the third chapter a unique retinal fundus image dataset, called RIGA (retinal images for glaucoma analysis), is presented. In this dataset, the optic disc and cup boundaries were annotated manually and independently by six ophthalmologists (glaucoma professionals) for a total of 4500 images, in order to obtain a comprehensive viewpoint and to see the variation and agreement between these professionals. Based on these evaluations, some of the images were filtered out through a statistical analysis in order to increase reliability. The new optic disc and cup segmentation methodologies are discussed in the fourth chapter. The process starts with a preprocessing step based on a reliable and precise algorithm: an interval Type-II fuzzy entropy based thresholding scheme, combined with Differential Evolution, is applied to locate the optic disc and determine the region of interest instead of dealing with the entire image. The processing step is then discussed. Two algorithms were applied: one for optic disc segmentation based on an active contour model implemented by a level set approach, and a second for optic cup segmentation, in which thresholding was applied to localize the cup. The disc and cup area and centroid are then calculated and evaluated against the manual annotations of area and centroid for the images retained after the statistical filtering.
In the fifth chapter, after segmenting the disc and cup, the clinical parameters used in diagnosing glaucoma, the horizontal and vertical cup-to-disc ratios (HCDR and VCDR), are computed automatically as a post-processing step, and the results are compared with the six ophthalmologists' manual annotations. The thesis concludes in chapter six with a discussion of future plans.
An Automatic Image Processing System for Glaucoma Screening
Horizontal and vertical cup-to-disc ratios are the most crucial parameters used clinically to detect glaucoma or monitor its progress, and they are manually evaluated from retinal fundus images of the optic nerve head. Given the rarity of glaucoma experts and the growing glaucoma population, automatically calculated horizontal and vertical cup-to-disc ratios (HCDR and VCDR, respectively) can be useful for glaucoma screening. We report on two algorithms to calculate the HCDR and VCDR. In the algorithms, level set and inpainting techniques were developed for segmenting the disc, while thresholding using a Type-II fuzzy approach was developed for segmenting the cup. The results from the algorithms were verified against the manual markings of images from a dataset of glaucomatous images (retinal fundus images for glaucoma analysis, the RIGA dataset) by six ophthalmologists. The algorithms' combined accuracy for HCDR and VCDR was 74.2%; only the manual markings of one ophthalmologist were more accurate. The algorithms agreed best with the markings of ophthalmologist number 1, in 230 (41.8%) of the tested images.
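Once disc and cup masks are available, computing the HCDR and VCDR reduces to comparing the horizontal and vertical extents of the two regions. The sketch below uses synthetic elliptical masks as stand-ins for the outputs of the segmentation stage; the shapes and sizes are invented for illustration.

```python
# Sketch: cup-to-disc ratios from binary segmentation masks.
# The elliptical masks below are synthetic; real masks would come from
# the disc and cup segmentation algorithms.
import numpy as np

def extent(mask, axis):
    """Pixel extent of a binary mask: width for axis=0 (collapse rows),
    height for axis=1 (collapse columns)."""
    coords = np.flatnonzero(mask.any(axis=axis))
    return coords[-1] - coords[0] + 1 if coords.size else 0

h, w = 200, 200
yy, xx = np.mgrid[:h, :w]
# Disc: 120 px tall, 100 px wide; cup: 60 px tall, 40 px wide.
disc = ((yy - 100) / 60) ** 2 + ((xx - 100) / 50) ** 2 <= 1
cup = ((yy - 100) / 30) ** 2 + ((xx - 100) / 20) ** 2 <= 1

hcdr = extent(cup, axis=0) / extent(disc, axis=0)   # horizontal CDR
vcdr = extent(cup, axis=1) / extent(disc, axis=1)   # vertical CDR
```

Clinically, larger ratios (e.g. VCDR above roughly 0.6) raise suspicion of glaucoma, which is why these two numbers are the screening outputs of interest.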
PROJECT DELIVERY SYSTEM DECISION FRAMEWORK USING THE WEIGHTING FACTORS AND ANALYTIC HIERARCHY PROCESS METHODS
There is a range of contract types and project delivery systems (PDS) that owners can use in executing facilities. Examples include the traditional Design-Bid-Build (DBB) process, Design-Build (DB) and Construction Management-at-Risk (CM-R). A number of owners in Saudi Arabia, particularly governments, prefer some form of competitive bidding (typically the DBB method), and most of the time they insist on it. However, the use of non-traditional delivery systems is increasing, and the system variations are becoming numerous. The selection of project delivery system influences the entire life-cycle of a construction project, from concept through construction into operation and decommissioning. Owners, engineers, contractors, material suppliers and laborers are all affected by the decisions that owners make concerning project delivery systems. Owners need to assess what type of construction services procurement program is best suited to their needs. Selecting a PDS means choosing the best delivery system to carry out a particular project, which is not always an easy and clear decision. The success or failure of a project can depend on the project delivery method, and whether the method is suited to the project. There are many factors and parameters or key considerations, such as cost (budget), time (schedule), quality (level of expertise), risk assessment (responsibility) and safety which determine whether a particular style of PDS is suited to a project. A model is a representation of a real or planned system and can be used as an aid in choosing a PDS. The purpose of this research is to try to develop a project delivery system decision framework (PDSDF) by identifying the factors and parameters that have to be considered in such a model. A survey was conducted to determine the values of factors and key parameters from completed projects. 
The research attempts to identify patterns of project factors, owner objectives, and project parameters that could best be met by one or another PDS. This model is intended to be very easy for owners to use, while at the same time providing meaningful results that can be used in selecting a suitable project delivery system. A weighting factors approach and the analytic hierarchy process (AHP) were used to construct the decision framework. In this process the relative advantages of the three project delivery systems are compared according to each criterion, and the relative importance of each criterion is determined on the basis of the owner's needs and project characteristics. The results of comparing the three delivery systems according to each criterion, and of determining the order of importance among the criteria, were integrated into a model to help the owner decide which project delivery system to adopt.
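The AHP step described above can be sketched numerically: the three delivery systems are compared pairwise under one criterion, and priority weights are taken from the principal eigenvector of the comparison matrix. The comparison values below are invented for illustration, not taken from the study's survey.

```python
# AHP sketch: priority weights for three delivery systems under one
# criterion, via the principal eigenvector of a pairwise comparison
# matrix. The Saaty-scale judgments here are illustrative assumptions.
import numpy as np

# A[i, j] = how strongly alternative i is preferred over j.
# Order: DBB, DB, CM-R.
A = np.array([
    [1.0,   3.0, 5.0],
    [1 / 3, 1.0, 2.0],
    [1 / 5, 1 / 2, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
principal = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, principal].real)
weights /= weights.sum()                  # priority vector, sums to 1

# Consistency check: CI / RI, with RI = 0.58 for a 3x3 matrix;
# judgments are considered acceptable when the ratio is below 0.10.
n = A.shape[0]
ci = (eigvals.real[principal] - n) / (n - 1)
consistency_ratio = ci / 0.58
```

Repeating this for each criterion and combining the per-criterion weights with the criterion weights yields an overall score per delivery system, which is the decision the framework supports.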
Relationships between emotional intelligence and sales performance in Kuwait
This study investigates the relationship between emotional intelligence (EI) and Total Sales Performance (TSP), and whether EI contributes to predicting the performance of sales professionals in Kuwait. The sample was 218 sales professionals working for 24 different car dealerships. An ability model of EI was measured using the Assessing Emotions Scale (AES) developed by Schutte et al. (1998) and its Arabic version. The trait model of EI was assessed using the Effective Intelligence Scale (EIS). The findings showed a negative but weak correlation between TSP and the AES and all its subscales. No correlation was found between TSP and the EIS. A weak positive correlation existed between Objective Sales Performance and each of total EIS, Accuracy, and Patience subscales
Recent developments in chemical reactivity of N,N-dimethylenamino ketones as synthons for various heterocycles
The current review presents recent progress in the utility of N,N-dimethyl enaminones as building blocks for a broad range of acyclic, carbocyclic, and five- and six-membered heterocyclic and fused heterocyclic derivatives. Most importantly, these N,N-dimethyl analogues have proven to be of biological interest and provide access to a new class of biologically active heterocyclic compounds for biomedical applications. All of these topics are drawn from the recent literature up to 2016.
Burden of musculoskeletal disorders in the Eastern Mediterranean Region, 1990-2013: findings from the Global Burden of Disease Study 2013.
OBJECTIVES: We used findings from the Global Burden of Disease Study 2013 to report the burden of musculoskeletal disorders in the Eastern Mediterranean Region (EMR). METHODS: The burden of musculoskeletal disorders was calculated for the EMR's 22 countries between 1990 and 2013. A systematic analysis was performed on mortality and morbidity data to estimate prevalence, deaths, years of life lost, years lived with disability and disability-adjusted life years (DALYs). RESULTS: For musculoskeletal disorders, the crude DALY rate per 100 000 increased from 1297.1 (95% uncertainty interval (UI) 924.3-1703.4) in 1990 to 1606.0 (95% UI 1141.2-2130.4) in 2013. During 1990-2013, the total DALYs of musculoskeletal disorders increased by 105.2% in the EMR compared with a 58.0% increase in the rest of the world. The burden of musculoskeletal disorders as a proportion of total DALYs increased from 2.4% (95% UI 1.7-3.0) in 1990 to 4.7% (95% UI 3.6-5.8) in 2013. The range of point prevalence (per 1000) among the EMR countries was 28.2-136.0 for low back pain, 27.3-49.7 for neck pain, 9.7-37.3 for osteoarthritis (OA), 0.6-2.2 for rheumatoid arthritis and 0.1-0.8 for gout. Low back pain and neck pain had the highest burden in EMR countries. CONCLUSIONS: This study shows a high burden of musculoskeletal disorders, with a faster increase in the EMR compared with the rest of the world. The reasons for this faster increase need to be explored. Our findings call for incorporating prevention and control programmes that should include improving health data, addressing risk factors, providing evidence-based care and community programmes to increase awareness.