
    Repositioning the Logistic Industry for Effective Service Delivery in Nigeria: A Case Study

    The logistics sector remains a silent but vital component of the transportation sector in Nigeria. In developed countries, the use of information and communication technology has brought tremendous transformation to many workplaces, including the logistics industry, in terms of speed, convenience, efficiency and wealth creation. Unfortunately, processes in Nigerian courier services are still largely managed manually, with attendant security challenges, wasted time and poor record storage. Through a system development life cycle methodology, this work developed a computerized courier mail tracking and management system with enhanced features using PHP and a MySQL database. It was hosted on a WampServer and successfully implemented for Suvex Delivery Services Limited (SDSL). The platform not only guarantees that processes in the logistics section are more secure and better organized for tracking and retrieval of relevant information, it also ensures that effectiveness and efficiency remain the watchword of the logistics industry.
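    The system described was built in PHP with MySQL, which is not reproduced here; the core register/update/track workflow it implements can be sketched in Python with an in-memory stand-in for the database (all names and statuses hypothetical):

```python
from dataclasses import dataclass, field

@dataclass
class Parcel:
    tracking_no: str
    sender: str
    recipient: str
    history: list = field(default_factory=list)  # (status, location) events

class CourierRegistry:
    """In-memory stand-in for the MySQL-backed tracking table."""
    def __init__(self):
        self._parcels = {}

    def register(self, parcel: Parcel) -> None:
        self._parcels[parcel.tracking_no] = parcel

    def update_status(self, tracking_no: str, status: str, location: str) -> None:
        """Append a new event, as a dispatch rider's scan would."""
        self._parcels[tracking_no].history.append((status, location))

    def track(self, tracking_no: str):
        """Return the latest known status, as a customer-facing lookup would."""
        history = self._parcels[tracking_no].history
        return history[-1] if history else None

registry = CourierRegistry()
registry.register(Parcel("SDSL-0001", "Ada", "Bola"))
registry.update_status("SDSL-0001", "In transit", "Lagos hub")
registry.update_status("SDSL-0001", "Delivered", "Uyo office")
latest = registry.track("SDSL-0001")  # ("Delivered", "Uyo office")
```

    Keeping every event rather than overwriting a single status field is what makes retrospective tracking and retrieval possible, mirroring the paper's emphasis on organized retrieval of information.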

    A Machine Learning Framework for Length of Stay Minimization in Healthcare Emergency Department

    The emergency departments (EDs) in most hospitals, especially in middle- and low-income countries, need techniques for minimizing the waiting time of patients. The application of appropriate methods can increase the number of patients treated, improve patient satisfaction, reduce healthcare costs, and lower the morbidity and mortality rates often associated with poor healthcare facilities, overcrowding, and low availability of healthcare professionals. Modeling the length of stay (LOS) of patients in healthcare systems is a challenge that must be addressed for sound decision-making regarding capacity planning and resource allocation. This paper presents a machine learning (ML) framework for predicting a patient's LOS within the ED. A study of the services in the ED of a tertiary healthcare facility in Uyo, Nigeria was conducted to gain insights into its operational procedures and evaluate the impact of certain parameters on LOS. Then, a computer simulation of the system was performed in the R programming language using data obtained from hospital records. Finally, four ML models for LOS prediction: Classification and Regression Tree (CART), Random Forest (RF), K-Nearest Neighbour (K-NN), and Support Vector Machine (SVM), were evaluated; the results indicate that SVM outperforms the others with the highest coefficient of determination (R²) score of 0.986984 and the lowest mean squared error (MSE) of 0.358594. The results demonstrate the capability of ML techniques to effectively assess the performance of healthcare systems and accurately predict patients' LOS, mitigating the low physician-patient ratio and improving throughput.
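    The four models named above map directly onto standard scikit-learn estimators. The hospital records and R simulation are not public, so a minimal sketch of the comparison must use synthetic data; the features below (and the resulting scores) are purely illustrative:

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor          # CART
from sklearn.ensemble import RandomForestRegressor      # RF
from sklearn.neighbors import KNeighborsRegressor       # K-NN
from sklearn.svm import SVR                             # SVM
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score, mean_squared_error

rng = np.random.default_rng(0)
# Hypothetical per-visit features: arrival hour, triage level, staff on duty
X = rng.uniform(size=(500, 3))
# Hypothetical LOS as a noisy function of those features
y = 2.0 * X[:, 0] + 3.0 * X[:, 1] - 1.5 * X[:, 2] + rng.normal(0, 0.1, 500)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
models = {
    "CART": DecisionTreeRegressor(random_state=0),
    "RF": RandomForestRegressor(random_state=0),
    "K-NN": KNeighborsRegressor(),
    "SVM": SVR(),
}
scores = {}
for name, model in models.items():
    pred = model.fit(X_tr, y_tr).predict(X_te)
    # Same two metrics the paper reports: R^2 and MSE
    scores[name] = (r2_score(y_te, pred), mean_squared_error(y_te, pred))
```

    Note that because R² and MSE are the reported metrics, the SVM here is used in its regression form (SVR), even though the paper labels the four models "classifiers".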

    Multi-layer Perceptron Model for Mitigating Distributed Denial of Service Flood Attack in Internet Kiosk Based Electronic Voting

    A Distributed Denial-of-Service (DDoS) flood attack targeting an Internet kiosk voting environment can prevent voters from casting their ballots in a timely manner. The goal of a DDoS flood attack is to make the voting server unavailable to voters during the election process. In this paper, we present a Multilayer Perceptron (MLP) algorithm to mitigate DDoS flood attacks in an e-voting environment and prevent such attacks from disrupting the availability of the vulnerable voting server. The developed intelligent DDoS flood mitigation model based on the MLP technique was simulated in MATLAB R2017a and evaluated using server utilization performance metrics in e-voting. Introducing the mitigation model into the DDoS attack model reduced server utilization from 1 to 0.4, indicating normal traffic. The MLP achieved an accuracy of 95% in mitigating DDoS flood attacks, preserving the availability of voting server resources for convenient and timely casting of ballots and supporting the credible delivery of electronic democratic decision-making.
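    The original model was built in MATLAB and its traffic data is not available; the underlying idea of training an MLP to separate flood traffic from normal traffic can be sketched in Python with scikit-learn on synthetic features (the feature choices and distributions below are assumptions, not the paper's):

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
# Hypothetical per-interval features: request rate, source-address entropy.
# Flood traffic: very high rate from a narrow set of sources.
normal = np.column_stack([rng.normal(50, 10, 300), rng.normal(0.9, 0.05, 300)])
flood = np.column_stack([rng.normal(500, 50, 300), rng.normal(0.2, 0.05, 300)])
X = np.vstack([normal, flood])
y = np.array([0] * 300 + [1] * 300)  # 0 = normal, 1 = DDoS flood

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=1)
# Scale inputs, then classify with a small single-hidden-layer MLP
clf = make_pipeline(
    StandardScaler(),
    MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=1),
)
accuracy = clf.fit(X_tr, y_tr).score(X_te, y_te)
```

    In a deployed mitigator, intervals the classifier flags as floods would be rate-limited or dropped before reaching the voting server, which is what drives the utilization drop reported above.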

    ETEASH-An Enhanced Tiny Encryption Algorithm for Secured Smart Home

    The proliferation of the "Internet of Things" (IoT) and its applications has affected every aspect of human endeavor, from smart manufacturing, agriculture, healthcare, and transportation to homes. The smart home is vulnerable to malicious attacks due to memory constraints, which inhibit the use of traditional antimalware and antivirus software and make the application of traditional cryptography for its security impractical. This work aimed at securing smart home devices by developing an enhanced Tiny Encryption Algorithm (TEA). The enhancement removes TEA's vulnerability to related-key attacks and its weakness of predictable keys through entropy shifting, stretching, and mixing techniques, making it usable for securing smart devices. The Enhanced Tiny Encryption Algorithm for Smart Home devices (ETEASH) was benchmarked against the original TEA using the Runs test and the avalanche effect. ETEASH passed the Runs test at a significance level of 0.05 for the null hypothesis, and achieved an avalanche effect of 58.44% against 52.50% for TEA. These results show that ETEASH is more secure than the standard TEA for securing smart home devices.
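    The ETEASH modifications themselves are not reproduced in this abstract, but the baseline it enhances, standard TEA, is well documented: a 64-bit block cipher with a 128-bit key and 32 Feistel-like rounds. A minimal Python sketch of TEA, together with the avalanche-effect measurement used for the benchmark, might look like this:

```python
MASK = 0xFFFFFFFF       # arithmetic is modulo 2^32
DELTA = 0x9E3779B9      # TEA's key-schedule constant

def tea_encrypt(v0, v1, key, rounds=32):
    """Standard TEA encryption of a 64-bit block (two 32-bit words).
    This is the baseline TEA; ETEASH's enhancements are not public here."""
    k0, k1, k2, k3 = key
    total = 0
    for _ in range(rounds):
        total = (total + DELTA) & MASK
        v0 = (v0 + (((v1 << 4) + k0) ^ (v1 + total) ^ ((v1 >> 5) + k1))) & MASK
        v1 = (v1 + (((v0 << 4) + k2) ^ (v0 + total) ^ ((v0 >> 5) + k3))) & MASK
    return v0, v1

def tea_decrypt(v0, v1, key, rounds=32):
    """Inverse of tea_encrypt: undo the rounds in reverse order."""
    k0, k1, k2, k3 = key
    total = (DELTA * rounds) & MASK
    for _ in range(rounds):
        v1 = (v1 - (((v0 << 4) + k2) ^ (v0 + total) ^ ((v0 >> 5) + k3))) & MASK
        v0 = (v0 - (((v1 << 4) + k0) ^ (v1 + total) ^ ((v1 >> 5) + k1))) & MASK
        total = (total - DELTA) & MASK
    return v0, v1

def avalanche(key):
    """Fraction of ciphertext bits flipped by a one-bit plaintext change,
    the metric behind the 52.50% (TEA) vs 58.44% (ETEASH) comparison."""
    c1 = tea_encrypt(0x01234567, 0x89ABCDEF, key)
    c2 = tea_encrypt(0x01234567 ^ 1, 0x89ABCDEF, key)
    diff = (c1[0] ^ c2[0], c1[1] ^ c2[1])
    return sum(bin(d).count("1") for d in diff) / 64

key = (0xDEADBEEF, 0xCAFEBABE, 0x0BADF00D, 0xFEEDFACE)  # example key only
```

    An ideal cipher flips about 50% of output bits per single-bit input change; values well above chance on average are the evidence the paper uses to compare ETEASH against TEA.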

    Reviewing Information Systems Usage and Performance Models

    A journal article by Dr. Clive Tsuma Katiba, an adjunct faculty member at USIU-A. Evaluating technology usage and performance has been a major challenge in the Information Systems (IS) field. Several successful attempts have been made by IS researchers to build and test theories that explain the impacts of these technologies, yet the constantly changing contexts of information technology continue to expose deficiencies in the existing models. Through a methodological review of existing models that study human interaction with systems, prominent IS theories such as the technology acceptance model (TAM), the theory of reasoned action (TRA), and the technology-to-performance model were examined in this study. The findings showed the following deficiencies: (a) some of these theories focus only on intention to use information systems, on usage, or on performance; (b) none of them includes the construct of satisfaction from the post-usage dimension; (c) none of the models uses constructs such as computer self-efficacy and TAM's external constructs as precursors of utilization; and (d) a high level of unexplained variance is associated with the current models. This study therefore presents a hybrid IS model for evaluating key factors predicting IS usage, satisfaction and performance in a mandatory e-learning usage environment. The implication for end-users, institutions and software developers is the availability of a framework for assessing current and future systems in terms of attitude towards use, IT usage, end-user satisfaction and performance.

    Implementing an Enhanced Procurement Management System Using Decision Support Techniques

    Modeling a procurement management system is important for quality decision-making regarding business capacity planning, supply and scheduling. In a Procurement Services Department (PSD), the most commonly used performance indicators include supply period, product ratings, ranking recommendations, resource scheduling, and the number of goods supplied and delivered. In this paper, a Decision Support System (DSS) technique is proposed to optimize procurement services in business and public organizations. First, a study of a typical public and private organization was conducted to gain insights into the operations of the procurement department and the contributions of important system parameters to procurement management. Second, a product and supplier collaborative filtering technique was investigated to obtain transformed data for model training and testing, implemented using the Partial Correlation Coefficient (PCC). The PCC for a particular product or supplier was used to generate outcomes with tuned values, which were compared with the actual observed outcomes. The residuals were evaluated in terms of linearity, normality, independence and constant variance. The system plots indicate good performance, and the quality and accuracy of the decision support model were evaluated using standard metrics. The overall implementation and performance results demonstrate the importance of decision support services in assessing the performance of procurement management systems. A robust tool for this assessment and a model for procurement and supply planning indicate that the system framework offers Quality of Service (QoS) provisioning.
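    The partial correlation coefficient at the heart of the filtering step measures the association between two variables while controlling for a third. A pure-Python sketch of the first-order formula (the variable names and data are hypothetical, not the paper's):

```python
from math import sqrt

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def partial_corr(x, y, z):
    """First-order partial correlation of x and y, controlling for z:
    r_xy.z = (r_xy - r_xz * r_yz) / sqrt((1 - r_xz^2) * (1 - r_yz^2))"""
    r_xy, r_xz, r_yz = pearson(x, y), pearson(x, z), pearson(y, z)
    return (r_xy - r_xz * r_yz) / sqrt((1 - r_xz**2) * (1 - r_yz**2))

# Hypothetical data: supplier rating (x), delivery score (y), order volume (z)
x = [3.1, 4.2, 4.8, 6.5, 7.0]
y = [2.0, 3.8, 4.1, 5.5, 6.9]
z = [1.0, 2.0, 3.0, 4.0, 5.0]
r = partial_corr(x, y, z)
```

    In the collaborative-filtering setting, controlling for a confounder such as order volume reveals how much of the supplier-delivery association survives once that shared driver is removed.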

    Windows Firewall Bypassing Techniques: An Overview of HTTP Tunneling and Nmap Evasion

    Internet technology has brought about significant improvements in economic activity, making automated processes the new norm. With this new technological drive comes an upsurge in criminal activity, as technology has proved to be a densely crime-perpetrated territory. Operating systems (OS) have had their fair share of this debacle, with significant updates being pushed out regularly to mitigate threats. In particular, the Windows OS has a firewall feature that has been a huge success in intrusion prevention and detection. The Windows 10 version of the OS continues to receive significant patches and updates to mitigate security threats. However, several techniques and experiments prove that firewalls alone are not sufficient for system protection. Advanced Evasion Techniques are new-generation firewall-bypassing mechanisms that combine simpler techniques to evade standard security tools, such as intrusion detection and prevention systems, that might otherwise detect them. Because multiple combinations of simpler components are possible, hundreds of thousands of potential Advanced Evasion Techniques exist. This paper therefore provides an overview of two of the most significant firewall-bypassing techniques: HTTP tunneling and Nmap evasion. A comparative study of both techniques highlights their similarities and differences, and outlines future work.
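    The essence of HTTP tunneling is smuggling an arbitrary payload through a firewall that only permits web traffic, by dressing it up as an ordinary HTTP request. A conceptual Python sketch (no real firewall, server, or Nmap involved; the request is crafted by hand purely to illustrate the encapsulation):

```python
import base64

def wrap_in_http(payload: bytes, host: str = "example.com") -> bytes:
    """Client side of a conceptual HTTP tunnel: encode an arbitrary
    (possibly non-HTTP) payload as the body of an innocuous-looking POST."""
    body = base64.b64encode(payload)
    headers = (
        f"POST /sync HTTP/1.1\r\n"
        f"Host: {host}\r\n"
        f"Content-Type: application/x-www-form-urlencoded\r\n"
        f"Content-Length: {len(body)}\r\n\r\n"
    ).encode("ascii")
    return headers + body

def unwrap_from_http(request: bytes) -> bytes:
    """Server side of the tunnel: discard the HTTP headers, decode the body."""
    _, _, body = request.partition(b"\r\n\r\n")
    return base64.b64decode(body)

secret = b"\x00\x01non-HTTP protocol bytes\xff"
tunneled = wrap_in_http(secret)
recovered = unwrap_from_http(tunneled)
```

    A port-based firewall inspecting this traffic sees only a well-formed POST to port 80; the binary payload rides inside. Nmap evasion, the paper's second technique, works at a lower layer (packet fragmentation, decoys, timing) rather than inside an application protocol.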

    A Dataset-Driven Parameter Tuning Approach for Enhanced K-Nearest Neighbour Algorithm Performance

    The number of neighbours (k) and the distance measure (DM) are widely modified to improve kNN performance. This work investigates the joint effect of these parameters, in conjunction with dataset characteristics (DC), on kNN performance. Euclidean, Chebychev, Manhattan, Minkowski, and Filtered distances, eleven k values, and four DC were systematically selected for the parameter-tuning experiments. Each experiment used 20 iterations, 10-fold cross-validation, and thirty-three randomly selected datasets from the UCI repository. The results show that the average root mean squared error of kNN is significantly affected by the type of task (p < 0.05), with sample size (SS) > 9000 identified as the optimal performance pattern for classification tasks. For regression problems, the recommended experimental configuration is 7000 ≤ SS ≤ 9000, 4 ≤ number of attributes ≤ 6, and DM = 'Filtered'. The type of task performed is the most influential determinant of kNN performance, followed by the DM. The variation in kNN accuracy resulting from changes in k values alone occurs by chance, as it does not follow any consistent pattern, while the joint effect of the k value with other parameters yielded a statistically insignificant change in mean accuracy (p > 0.5). As further work, the discovered patterns will serve as a standard reference for comparative analytics of kNN performance against other classification and regression algorithms.
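    A minimal pure-Python kNN classifier exposing the two parameters the study tunes, k and the distance measure (Manhattan, Euclidean and the general Minkowski form, plus Chebychev), might look like this (the toy dataset is illustrative only, not from the UCI experiments):

```python
from collections import Counter

def minkowski(a, b, p):
    """Minkowski distance; p=1 gives Manhattan, p=2 gives Euclidean."""
    return sum(abs(x - y) ** p for x, y in zip(a, b)) ** (1 / p)

def chebychev(a, b):
    """Chebychev distance: the largest per-coordinate difference."""
    return max(abs(x - y) for x, y in zip(a, b))

def knn_predict(train, query, k=3, dist=lambda a, b: minkowski(a, b, 2)):
    """Majority vote over the k training points nearest to the query."""
    neighbours = sorted(train, key=lambda item: dist(item[0], query))[:k]
    votes = Counter(label for _, label in neighbours)
    return votes.most_common(1)[0][0]

# Toy dataset: two well-separated clusters of (feature, label) pairs
train = [((0.0, 0.1), "A"), ((0.2, 0.0), "A"), ((0.1, 0.2), "A"),
         ((5.0, 5.1), "B"), ((5.2, 5.0), "B"), ((5.1, 5.2), "B")]
pred_euclid = knn_predict(train, (0.1, 0.1), k=3)            # "A"
pred_cheby = knn_predict(train, (5.1, 5.1), k=3, dist=chebychev)  # "B"
```

    On well-separated data like this, the choice of k and DM barely matters; the study's point is precisely that on real datasets their influence depends on task type and dataset characteristics, which is why both are exposed as tunable arguments here.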