197 research outputs found

    Role of artificial intelligence in cloud computing, IoT and SDN: Reliability and scalability issues

    Artificial intelligence (AI) now plays a central role in information technology, as it is key to providing better services. Its inherent strengths are driving companies into a modern, decisive, secure, and insight-driven arena to address current and future challenges. Key technologies such as cloud computing, the internet of things (IoT), and software-defined networking (SDN) are emerging as platforms for future applications and are delivering benefits to society. Integrating AI with these technologies at scale raises their efficiency further: data generated by heterogeneous devices are received, exchanged, stored, managed, and analyzed to automate the overall system and to improve its performance and reliability. These technologies are not free of limitations, however, and their integration raises many challenges in terms of scalability and reliability. This paper therefore discusses the role of AI, along with the issues and opportunities that confront all communities incorporating the integration of these technologies in terms of reliability and scalability. It also puts forward future directions relating to scalability and reliability concerns in this integration, to enable researchers to address the current research gaps.

    Impact Of Missing Data Imputation On The Fairness And Accuracy Of Graph Node Classifiers

    The fairness of machine learning (ML) algorithms has recently attracted many researchers' interest. Most ML methods show bias toward protected groups, which limits the applicability of ML models in sensitive applications such as crime-rate prediction. Real data may also contain missing values which, if not handled appropriately, are known to further harm fairness. Many imputation methods have been proposed for missing data, but their effect on fairness has not been studied well. In this paper, we analyze the fairness impact of imputing missing graph data (node attributes) using different embedding and neural network methods. Extensive experiments on six datasets demonstrate severe fairness issues in missing-data imputation under graph node classification. We also find that the choice of imputation method affects both fairness and accuracy. Our results provide valuable insights into graph-data fairness and into handling missingness in graphs efficiently, and point to directions for theoretical studies of fairness in graph data. Comment: Accepted at the IEEE International Conference on Big Data (IEEE Big Data).
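As a concrete illustration of the kind of analysis this abstract describes, the sketch below (an assumption for illustration, not the authors' code) applies simple mean imputation to a node-attribute matrix and measures a common group-fairness gap, the statistical parity difference:

```python
import numpy as np

def mean_impute(X):
    """Fill NaN entries of a node-attribute matrix with per-feature means."""
    X = X.astype(float).copy()
    col_means = np.nanmean(X, axis=0)          # per-column mean, ignoring NaNs
    nan_rows, nan_cols = np.where(np.isnan(X))
    X[nan_rows, nan_cols] = col_means[nan_cols]
    return X

def statistical_parity_difference(y_pred, group):
    """|P(y=1 | group=0) - P(y=1 | group=1)|: gap in positive-prediction rates
    between a protected and an unprotected group."""
    rate0 = y_pred[group == 0].mean()
    rate1 = y_pred[group == 1].mean()
    return abs(rate0 - rate1)

# Toy example: impute two node attributes, then compare prediction rates.
X = np.array([[1.0, np.nan],
              [3.0, 4.0]])
X_filled = mean_impute(X)

y_pred = np.array([1, 0, 1, 1])   # hypothetical classifier outputs
group  = np.array([0, 0, 1, 1])   # hypothetical protected-group labels
gap = statistical_parity_difference(y_pred, group)
```

In the paper's setting the imputer would be an embedding or neural method and the fairness metrics richer, but the measurement pattern, impute then compare group-wise outcome rates, is the same.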

    An Efficient Classification of Emotions in Students' Feedback using Deep Neural Network

    Background and Objective: In both the corporate and academic worlds, the collection and analysis of feedback (product evaluations, social media debate, and student input) has long been a significant topic. Traditional approaches to collecting student feedback have focused on data collection and analysis via questionnaires. However, students also make comments on social media sites, and these need to be examined to improve educational standards at schools. Methods: The purpose of this work is to construct a deep neural network-based system to assess students' feedback and the emotions found in their reviews. Our approach applies a deep learning-based Bi-LSTM model to a benchmark student-feedback dataset and categorizes students' feedback about their instructors according to emotional states such as love, happiness, fury, and disdain. Results: The experimental findings demonstrate that the proposed approach outperforms both benchmark studies and state-of-the-art machine learning classifiers.
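A full Bi-LSTM needs a deep-learning framework, but the core idea, encoding a sequence in both directions and concatenating the final states before classification, can be sketched with a simplified tanh-RNN in plain NumPy (an illustrative stand-in under assumed weights, not the authors' model):

```python
import numpy as np

def simple_rnn_pass(X, Wx, Wh, reverse=False):
    """One plain tanh-RNN pass over a sequence X of shape (T, d).
    Returns the final hidden state of the pass."""
    T = X.shape[0]
    h = np.zeros(Wh.shape[0])
    order = range(T - 1, -1, -1) if reverse else range(T)
    for t in order:
        h = np.tanh(X[t] @ Wx + h @ Wh)
    return h

def bidirectional_encoding(X, Wx, Wh):
    """Concatenate the forward and backward final states, as a Bi-LSTM
    (here simplified to a Bi-RNN) does before the classification layer."""
    fwd = simple_rnn_pass(X, Wx, Wh)
    bwd = simple_rnn_pass(X, Wx, Wh, reverse=True)
    return np.concatenate([fwd, bwd])

# Toy sequence: 5 tokens, 3-dim embeddings, 4 hidden units per direction.
rng = np.random.default_rng(0)
X  = rng.normal(size=(5, 3))   # hypothetical word embeddings
Wx = rng.normal(size=(3, 4))
Wh = rng.normal(size=(4, 4))
encoding = bidirectional_encoding(X, Wx, Wh)   # 8-dim feature for a softmax head
```

The LSTM adds gating to this recurrence; the bidirectional concatenation, which lets the emotion classifier see context from both ends of a review, is identical in structure.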

    Cytotoxicity, Morphology and Chemical Composition of Two Luting Cements: An in Vitro Study

    Objective: To assess the cytotoxicity, surface morphology, elemental composition, and chemical characterization of two commonly used luting cements. Material and Methods: The two luting cements used were Elite Cement® and Hy-Bond Resiglass®. Freshly mixed (n=6) and set (n=6) samples of each cement were placed in medium to obtain extracts, and the extract from each sample was exposed to L929 mouse fibroblasts (1×10⁴ cells/well). Cell viability was assessed with the Alamar Blue assay. Surface morphology and elemental composition were evaluated using scanning electron microscopy and energy-dispersive spectroscopy, and chemical characterization was performed by Fourier-transform infrared spectroscopy. One-way ANOVA with post-hoc Tukey analysis was conducted to assess the results. Results: Hy-Bond Resiglass® was the more cytotoxic of the two cements in both the freshly mixed (68.10 ± 5.16; p<0.05) and set (87.58 ± 4.86; p<0.05) states, compared to Elite Cement® freshly mixed (77.01 ± 5.45; p<0.05) and set (89.39 ± 5.66; p<0.05). Scanning electron microscopy revealed a more irregular and porous structure in Hy-Bond Resiglass® than in Elite Cement®, and energy-dispersive spectroscopy showed intense peaks of aluminium, tungsten, and fluorine in Hy-Bond Resiglass®. Conclusion: All three of these elements (aluminium, tungsten, and fluorine) have cytotoxic potential. Fourier-transform infrared spectroscopy also revealed the presence of hydroxyethyl methacrylate, which has cytotoxic potential, in Hy-Bond Resiglass®.
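The one-way ANOVA used to compare the viability measurements reduces to a ratio of between-group to within-group variance. A minimal sketch on hypothetical values (not the study's measurements):

```python
import numpy as np

def one_way_anova_F(groups):
    """F statistic for a one-way ANOVA across independent groups:
    (between-group mean square) / (within-group mean square)."""
    all_vals = np.concatenate([np.asarray(g, float) for g in groups])
    grand = all_vals.mean()
    k = len(groups)          # number of groups
    n = len(all_vals)        # total observations
    ss_between = sum(len(g) * (np.mean(g) - grand) ** 2 for g in groups)
    ss_within = sum(((np.asarray(g, float) - np.mean(g)) ** 2).sum() for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical viability readings for two cement groups.
F = one_way_anova_F([[68.1, 70.2, 66.5], [77.0, 78.3, 75.9]])
```

A large F (compared against the F distribution with k-1 and n-k degrees of freedom) indicates the group means differ; Tukey's post-hoc test then identifies which pairs differ.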

    Singly-fed rectangular dielectric resonator antenna with a wide circular polarization bandwidth and beamwidth for WiMAX/satellite applications

    A rectangular dielectric resonator antenna (DRA) has been excited using a unique conformal H-shaped metal strip. This excitation drives the degenerate pair of first higher-order modes TE^x_δ13 and TE^y_1δ3 for bandwidth improvement and high gain. A broadband circular polarization (CP) bandwidth of ~20% (3.67-4.4 GHz), in conjunction with a wide impedance-matching bandwidth of ~27.7% (3.67-4.73 GHz), has been achieved. The antenna offers a CP beamwidth of 89° in the φ = 0° plane and ~32° in the φ = 90° plane, and provides a high gain of ~6.8 dBic, a significant improvement over the circularly polarized rectangular DRAs reported in the literature for similar applications. This broad CP bandwidth and beamwidth can be considerably beneficial for worldwide interoperability for microwave access (WiMAX) and satellite applications. Furthermore, the proposed antenna has been fabricated to validate the simulated results, and the measured results agree well with the simulations.

    Compliance with continuous positive airway pressure (CPAP) therapy for obstructive sleep apnea among privately paying patients- a cross sectional study

    Background: To evaluate the compliance, benefits, and side effects associated with continuous positive airway pressure (CPAP) therapy among Pakistani patients treated for obstructive sleep apnea (OSA) in the private sector. Methods: Patients diagnosed with OSA on the basis of an overnight study, who were recommended CPAP therapy between 1998 and 2003, were evaluated by telephone survey and review of hospital notes. Compliance, benefits, and side effects associated with CPAP therapy were assessed. Results: Of the 135 patients prescribed CPAP therapy, 75 could be contacted. Sixty (80%) started using CPAP within one month of diagnosis and 46 (61%) continued to use it long term (beyond one year). Compliance with CPAP therapy was associated with higher body mass index, higher Epworth sleepiness scale score, a history of witnessed apnea, and reduction in daytime sleepiness with CPAP therapy. OSA severity, as assessed by the apnea-hypopnea index, did not affect compliance. Use of antidepressants and CPAP-induced sleep disturbances were associated with poor compliance. Conclusions: Obesity, excessive daytime sleepiness, witnessed apnea, and improvement of daytime symptoms with CPAP were predictors of better compliance, while use of antidepressants and CPAP-induced sleep disturbances were predictors of poor compliance.

    The development and validation of a scoring tool to predict the operative duration of elective laparoscopic cholecystectomy

    Background: The ability to accurately predict operative duration has the potential to optimise theatre efficiency and utilisation, thus reducing costs and increasing staff and patient satisfaction. With laparoscopic cholecystectomy being one of the most commonly performed procedures worldwide, a tool to predict operative duration could be extremely beneficial to healthcare organisations. Methods: Data collected in the CholeS study on patients undergoing cholecystectomy in UK and Irish hospitals between 04/2014 and 05/2014 were used to study operative duration. A multivariable binary logistic regression model was produced in order to identify significant independent predictors of long (> 90 min) operations. The resulting model was converted into a risk score, which was subsequently validated on a second cohort of patients using ROC curves. Results: After exclusions, data were available for 7227 patients in the derivation (CholeS) cohort. The median operative duration was 60 min (interquartile range 45-85), with 17.7% of operations lasting longer than 90 min. Ten factors were found to be significant independent predictors of operative durations > 90 min, including ASA grade, age, previous surgical admissions, BMI, gallbladder wall thickness, and CBD diameter. A risk score was then produced from these factors and applied to a cohort of 2405 patients from a tertiary centre for external validation. This returned an area under the ROC curve of 0.708 (SE = 0.013), with the probability of an operation lasting > 90 min increasing more than eightfold, from 5.1% to 41.8%, between the extremes of the score. Conclusion: The scoring tool produced in this study was found to be significantly predictive of long operative durations on validation in an external cohort. As such, the tool may have the potential to enable organisations to better organise theatre lists and deliver greater efficiencies in care.
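The two ingredients the abstract describes, turning regression coefficients into integer risk points and validating the score by the area under the ROC curve, can be sketched as follows. The coefficient names and values here are hypothetical, not the published CholeS model:

```python
import numpy as np

def points_from_coeffs(coeffs, scale):
    """Convert logistic-regression coefficients into integer risk points by
    dividing by a reference coefficient -- a common way to build scoring tools."""
    return {name: int(round(beta / scale)) for name, beta in coeffs.items()}

def auc_score(y_true, scores):
    """Area under the ROC curve via the rank (Mann-Whitney) formulation:
    the fraction of (positive, negative) pairs the score orders correctly,
    counting ties as half."""
    y_true = np.asarray(y_true)
    scores = np.asarray(scores, float)
    pos = scores[y_true == 1]
    neg = scores[y_true == 0]
    wins = (pos[:, None] > neg[None, :]).sum() \
         + 0.5 * (pos[:, None] == neg[None, :]).sum()
    return wins / (len(pos) * len(neg))

# Hypothetical coefficients for two predictors, scaled so the smallest = 1 point.
points = points_from_coeffs({"ASA >= 3": 0.9, "BMI >= 30": 0.45}, scale=0.45)

# Validate a score on a toy external cohort (1 = operation > 90 min).
y_true = [0, 0, 1, 1]
risk   = [1, 3, 2, 5]
auc = auc_score(y_true, risk)
```

An AUC of ~0.7, as the study reports, means a randomly chosen long operation outscores a randomly chosen short one about 70% of the time.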

    Error-corrected location determination in an outdoor wireless environment using estimation, filtering, prediction and fusion techniques: a WiFi application using terrain-based knowledge

    Location estimation of wireless nodes has been a very popular research area in recent years. Research on location estimation is not limited to satellite communication; it also covers WLAN, MANET, WSN, and cellular communication. With the growth and advancement of cellular communication architectures, the use of handheld devices has increased rapidly, and so have calls originated by mobile users: it is estimated that more than 50% of emergency calls are placed from mobile phones. Researchers have used different location estimation techniques, such as satellite-based, geometrical, statistical, and mapping techniques, and have combined two or more of them to improve accuracy. Terrain-based location estimation, however, is an area that researchers have not considered extensively. Because radio waves behave differently in different atmospheres, calculating a few parameters is not sufficient to achieve accuracy across terrains, especially when the estimate is based entirely on received signal strength (RSS), which carries impairments. This research focuses on the localization of wireless nodes using geometrical and statistical techniques that take terrain impairment and attenuation into account. The proposed model consists of four stages: estimation, filtering, prediction, and fusion. A prototype was built using the WiFi IEEE 802.11x standard. In step one, using the signal-to-noise ratio, the geographical area of peninsular Malaysia is categorized into 13 different terrains/clutters. In step two, point-to-point data points are recorded from the available and received signal strength, taking the different terrains into consideration. In step three, the location is estimated using the triangulation method. In step four, the estimated locations are filtered using the average and the mean of means; for error correction, the locations are also filtered using the k-nearest-neighbor rule. In step five, prediction using the combined variance identifies a region of interest, which helps eliminate locations outside the selected area. In step six, the filtering results are fused with the prediction to achieve better accuracy. The results show that this research reduces errors from 18 m to 6 m in highly attenuated terrains and from 3.5 m to 0.5 m in lightly attenuated terrains.
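The estimation step described above, ranging from RSS and then triangulating, can be sketched as follows. The path-loss constants are assumed, terrain-dependent values for illustration, not the thesis's calibrated parameters:

```python
import numpy as np

def rss_to_distance(rss_dbm, rss_at_1m=-40.0, path_loss_exp=3.0):
    """Invert the log-distance path-loss model to estimate range (m) from RSS.
    rss_at_1m and path_loss_exp are assumed constants; in terrain-aware
    localization they would be calibrated per terrain/clutter class."""
    return 10 ** ((rss_at_1m - rss_dbm) / (10 * path_loss_exp))

def trilaterate(anchors, dists):
    """Least-squares position estimate from >= 3 anchor points and ranges.
    Linearizes the circle equations by subtracting the first from the rest."""
    anchors = np.asarray(anchors, float)
    dists = np.asarray(dists, float)
    d0 = dists[0]
    A = 2 * (anchors[1:] - anchors[0])
    b = (d0 ** 2 - dists[1:] ** 2
         + (anchors[1:] ** 2).sum(axis=1) - (anchors[0] ** 2).sum())
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos

# Three access points at known positions; ranges estimated from RSS readings.
anchors = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
rss_readings = [-61.0, -58.0, -55.0]            # hypothetical dBm values
dists = [rss_to_distance(r) for r in rss_readings]
estimate = trilaterate(anchors, dists)
```

In the full pipeline this raw estimate would then be filtered (averaging, mean of means, k-nearest-neighbor correction) and fused with the combined-variance region prediction.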

    Error-corrected localization in an outdoor wireless environment using estimation, filtering, prediction and fusion techniques: a WiFi application using terrain-based knowledge

    Location estimation of wireless nodes has been a very popular research area in recent years. Research on location estimation is not limited to satellite communication; it also covers WLAN, MANET, WSN, and cellular communication. With the growth and advancement of cellular communication architectures, the use of handheld devices has increased rapidly, and so have calls originated by mobile users: it is estimated that more than 50% of emergency calls are placed from mobile phones. Researchers have used different location estimation techniques, such as satellite-based, geometrical, statistical, and mapping techniques, and have combined two or more of them to improve accuracy. Terrain-based location estimation, however, is an area that researchers have not considered extensively. Because radio waves behave differently in different atmospheres, calculating a few parameters is not sufficient to achieve accuracy across terrains, especially when the estimate is based entirely on received signal strength (RSS), which carries impairments. This research focuses on the localization of wireless nodes using geometrical and statistical techniques that take terrain impairment and attenuation into account. The proposed model consists of four stages: estimation, filtering, prediction, and fusion. A prototype was built using the WiFi IEEE 802.11x standard. In step one, using the signal-to-noise ratio, the geographical area of peninsular Malaysia is categorized into 13 different terrains/clutters. In step two, point-to-point data points are recorded from the available and received signal strength, taking the different terrains into consideration. In step three, the location is estimated using the triangulation method. In step four, the estimated locations are filtered using the average and the mean of means; for error correction, the locations are also filtered using the k-nearest-neighbor rule. In step five, prediction using the combined variance identifies a region of interest, which helps eliminate locations outside the selected area. In step six, the filtering results are fused with the prediction to achieve better accuracy. The results show that this research reduces errors from 18 m to 6 m in highly attenuated terrains and from 3.5 m to 0.5 m in lightly attenuated terrains.

    Performance Evaluation of Load-Balancing Algorithms with Different Service Broker Policies for Cloud Computing

    Cloud computing has seen a major boom during the past few years. Many users have switched to cloud computing because traditional systems require complex resource distribution and cloud solutions are less expensive. Load balancing (LB), which distributes the workload across cloud services, is one of the essential challenges in cloud computing. This paper presents a performance evaluation of existing load-balancing algorithms: particle swarm optimization (PSO), round robin (RR), equally spread current execution (ESCE), and throttled load balancing. The study offers a detailed evaluation using a cloud analyst platform, measuring the algorithms' virtual machine load balance under various service broker policy configurations with metrics such as optimized response time (ORT), data center processing time (DCPT), virtual machine cost, data transfer cost, and total cost, across different workloads and user bases. Many past papers on round robin, equally spread current execution, and throttled load balancing considered only efficiency and response time in virtual machines, without recognizing the relation between tasks and virtual machines or the practical significance of the application. Here, a comparison of specific load-balancing algorithms has been investigated, and different service broker policy (SBP) tests have been conducted to illustrate the algorithms' capabilities.
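Two of the compared policies are simple enough to sketch directly. The round-robin and throttled balancers below are minimal illustrations of the dispatch logic, not the actual simulator implementations (which queue saturated requests rather than rejecting them):

```python
from itertools import cycle

class RoundRobinBalancer:
    """Round robin: cycle incoming requests evenly over the virtual machines,
    ignoring how loaded each VM currently is."""
    def __init__(self, vms):
        self._ring = cycle(vms)

    def pick(self):
        return next(self._ring)

class ThrottledBalancer:
    """Throttled: dispatch only to a VM below its concurrency cap; return None
    when every VM is saturated (a real broker would queue the request)."""
    def __init__(self, vms, cap):
        self.load = {vm: 0 for vm in vms}   # active requests per VM
        self.cap = cap

    def pick(self):
        for vm, active in self.load.items():
            if active < self.cap:
                self.load[vm] += 1
                return vm
        return None                          # all VMs at capacity

    def release(self, vm):
        """Call when a request finishes, freeing a slot on its VM."""
        self.load[vm] -= 1

# Usage: round robin alternates blindly; throttled respects the cap.
rr = RoundRobinBalancer(["vm1", "vm2"])
assignments = [rr.pick() for _ in range(4)]   # vm1, vm2, vm1, vm2

th = ThrottledBalancer(["vm1", "vm2"], cap=1)
first, second, third = th.pick(), th.pick(), th.pick()  # third is None
```

The evaluated metrics (ORT, DCPT, costs) then come from simulating request streams against such policies under each service broker policy.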