12 research outputs found

    Analysis of cryptocurrencies adoption using fractional grey Lotka-Volterra models

    Get PDF
    Abstract: Analytically solving nonlinear dynamical systems in continuous time is often problematic. Accumulation generating operations provide a tool for formulating a discrete dynamical form whose properties are relatively close to those of the corresponding nonlinear system. The present study discusses three versions of 2- and 3-dimensional discrete Lotka-Volterra dynamical systems with application to cryptocurrency adoption. The application concerns three cryptocurrencies, namely Bitcoin, Litecoin and Ripple: the 2-dimensional application covers Bitcoin and Litecoin, while the 3-dimensional application covers Bitcoin, Litecoin and Ripple. The dataset includes records from 28-April-2013 to 10-February-2018, which provide forecasting values for Bitcoin and Litecoin through the 2-dimensional study, while records from 7-August-2013 to 10-February-2018 provide forecasting values for Bitcoin, Litecoin and Ripple through the 3-dimensional study. The thesis has produced four papers that have been published and presented at international conferences. Ph.D. (Mathematics)
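
    A minimal sketch of the kind of 2-dimensional discrete competitive Lotka-Volterra iteration discussed above, assuming Python/NumPy; the update rule, the parameter values and the `discrete_lotka_volterra` helper are illustrative placeholders, not the thesis's fractional grey formulation or fitted values for Bitcoin and Litecoin.

```python
import numpy as np

# Illustrative 2-dimensional discrete competitive Lotka-Volterra system.
# Parameters are placeholders, not estimates from the 2013-2018 dataset.
def discrete_lotka_volterra(x0, y0, r1, r2, a12, a21, k1, k2, steps):
    """Iterate x[t+1] = x[t] + r1*x[t]*(1 - (x[t] + a12*y[t])/k1),
    and symmetrically for y; return both trajectories."""
    x, y = np.empty(steps + 1), np.empty(steps + 1)
    x[0], y[0] = x0, y0
    for t in range(steps):
        x[t + 1] = x[t] + r1 * x[t] * (1.0 - (x[t] + a12 * y[t]) / k1)
        y[t + 1] = y[t] + r2 * y[t] * (1.0 - (y[t] + a21 * x[t]) / k2)
    return x, y

# Hypothetical adoption levels for two competing cryptocurrencies.
coin_a, coin_b = discrete_lotka_volterra(
    x0=1.0, y0=0.2, r1=0.05, r2=0.08,
    a12=0.3, a21=0.6, k1=100.0, k2=40.0, steps=500,
)
print("final adoption levels:", coin_a[-1], coin_b[-1])
```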

    Some statistical methods in analysis of single and multiple events with application to infant mortality data.

    Get PDF
    Doctoral Degree. University of KwaZulu-Natal, Pietermaritzburg. Time-to-event analysis, or survival analysis, aims at making inferences on the time elapsed between the recruitment of subjects, or the onset of observations, and the occurrence of some event of interest. Methods used in general statistical analysis, in particular in regression analysis, are not directly applicable to time-to-event data due to covariate correlation, censoring and truncation. While analysing time-to-event data, medical statistics adopts mainly non-parametric methods due to the difficulty of finding an adequate distribution for the phenomenon under study. This study reviews non-parametric classical methods of time-to-event analysis, namely the Aalen Additive Hazards Model (AAHM) through counting and martingale processes, the Cox Proportional Hazards Model (CPHM) and the Cox-Aalen Hazards Model (CAHM), with application to infant mortality at the Kigali University Teaching Hospital (KUTH) in Rwanda. The proportional hazards assumption (PHA) was checked by assessing Kaplan-Meier estimates of survival functions per group of covariates. Multiple-events models were also reviewed and a model suitable to the dataset was selected. The dataset comprises 2117 newborns and socio-economic and clinical covariates for mothers and children. Two events per subject were modeled, namely death and the occurrence of at least one of the conditions that may also cause long-term death in infants. To address the instability of models (that is, to check the consistency of models) and the potentially small sample size, re-sampling was applied to both the CPHM and the appropriate multiple-events model. The popular non-parametric re-sampling methods, namely the bootstrap and the jackknife, were conducted for the available covariates, and the re-sampled models were then compared to the non-re-sampled ones. The results in the different models reveal significant and non-significant covariates, the relative risk and the related standard error and confidence intervals per covariate. Among the results, it was found that babies born to mothers under 20 years old were at relatively higher risk, and pregnancy before the age of 20 should therefore be avoided. It was also found that abnormalities in an infant's weight and head increase the risk of infant mortality; clinically recommended ways of protecting the pregnancy against any cause of infant abnormality were therefore recommended.
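
    A minimal sketch of the bootstrap step described above, assuming the Python lifelines library; the Rossi recidivism dataset bundled with lifelines stands in for the KUTH infant data, and `bootstrap_cox`, the number of replicates and the column names are assumptions for illustration.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter
from lifelines.datasets import load_rossi  # stand-in dataset, not the KUTH data

# Non-parametric bootstrap of a Cox proportional hazards model: refit on
# resampled rows and summarise the spread of the estimated coefficients.
def bootstrap_cox(df, duration_col, event_col, n_boot=200, seed=0):
    rng = np.random.default_rng(seed)
    coefs = []
    for _ in range(n_boot):
        sample = df.sample(n=len(df), replace=True,
                           random_state=int(rng.integers(1_000_000_000)))
        cph = CoxPHFitter()
        cph.fit(sample, duration_col=duration_col, event_col=event_col)
        coefs.append(cph.params_)
    return pd.DataFrame(coefs)

df = load_rossi()                   # duration "week", event "arrest", covariates
boot = bootstrap_cox(df, "week", "arrest", n_boot=100)
print(boot.agg(["mean", "std"]).T)  # bootstrap mean and standard error per covariate
```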

    Detecting Learning Patterns in Tertiary Education Using K-Means Clustering

    Get PDF
    Abstract: We are in an era where various processes need to be online. However, data from digital learning platforms are still underutilised in higher education, yet they contain student learning patterns whose awareness would contribute to educational development. Furthermore, knowledge of student progress would inform educators on whether to adjust teaching conditions for critically performing students. Limited knowledge of performance patterns restricts the development of adaptive teaching and learning mechanisms. In this paper, a model for data exploitation to dynamically study students' progress is proposed. Variables to determine current students' progress are defined and are used to group students into different clusters. A model for dynamic clustering is proposed and the related cluster migration is analysed to isolate poorer- or higher-performing students. K-means clustering is performed on real data consisting of students from a South African tertiary institution. The proposed model for cluster migration analysis is applied and the corresponding learning patterns are revealed.
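
    A minimal sketch of the clustering step, assuming scikit-learn and synthetic per-student features; the variable names (`avg_score`, `submissions`, `platform_hours`) and the choice of three clusters are illustrative assumptions, not the progress variables defined in the paper.

```python
import numpy as np
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

# Synthetic stand-in for per-student progress indicators.
rng = np.random.default_rng(42)
students = pd.DataFrame({
    "avg_score": rng.normal(60, 15, 300),
    "submissions": rng.poisson(12, 300),
    "platform_hours": rng.gamma(2.0, 5.0, 300),
})

# Standardise the features, then group students into clusters.
X = StandardScaler().fit_transform(students)
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0)
students["cluster"] = kmeans.fit_predict(X)

# Cluster means in the original units help label groups (e.g. lower- vs
# higher-performing); migration would be tracked by re-clustering at a
# later snapshot and comparing assignments.
print(students.groupby("cluster").mean().round(2))
```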

    A Novel Epidemic Model for the Interference Spread in the Internet of Things

    Get PDF
    Due to multi-technology advancements, Internet of Things (IoT) applications are in high demand to create smarter environments. Smart objects communicate by exchanging many messages, and this creates interference at receivers. Collection tree algorithms are applied to reduce only the nodes'/paths' interference but cannot fully handle the interference across the underlying IoT. This paper models and analyzes the interference spread in an IoT setting where a collection tree routing algorithm is adopted. Node interference is treated as a real-life contamination by a disease, where individuals can migrate across compartments such as susceptible, attacked and replaced. The assumed typical collection tree routing model is the least interference beaconing algorithm (LIBA), and the dynamics of the interference spread are studied. The underlying network's nodes are partitioned into groups of nodes which can affect each other and, based on this partition property, the susceptible–attacked–replaced (SAR) model is proposed. To analyze the model, the system's stability is studied, and the compartment-based trends are examined experimentally in static, stochastic and predictive systems. The results show that the dynamics of the system depend on the groups, and that all groups have points of convergence for the static, stochastic and predictive systems.
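
    A minimal sketch of susceptible–attacked–replaced dynamics, written here as ordinary differential equations with illustrative rates, assuming SciPy; the paper's SAR model is defined over LIBA-induced node groups, so the single-population structure and the parameter values below are assumptions.

```python
import numpy as np
from scipy.integrate import solve_ivp

# SIR-style compartments relabelled for interference: susceptible (S),
# attacked (A) and replaced (R) nodes. Rates are illustrative placeholders.
beta = 0.4    # rate at which attacked nodes interfere with susceptible ones
gamma = 0.1   # rate at which attacked nodes get replaced

def sar(t, y):
    S, A, R = y
    n = S + A + R
    dS = -beta * S * A / n
    dA = beta * S * A / n - gamma * A
    dR = gamma * A
    return [dS, dA, dR]

sol = solve_ivp(sar, (0.0, 100.0), [990.0, 10.0, 0.0],
                t_eval=np.linspace(0.0, 100.0, 201))
S, A, R = sol.y
print(f"attacked peak: {A.max():.1f} nodes at t = {sol.t[A.argmax()]:.1f}")
print(f"replaced by t = 100: {R[-1]:.1f} nodes")
```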

    Survival analysis and its stochastic process approach with application to diabetes data

    Get PDF
    Abstract: Survival analysis, also called time-to-event analysis, aims at making inferences on the lifetime, that is, the time elapsed between the recruitment of subjects, or the onset of observations, and the occurrence of some event of interest. Methods used in general statistical analysis, in particular in regression analysis, are not directly applicable to survival data due to censoring and truncation. This study reviewed non-parametric, semi-parametric, and briefly parametric methods used in classical survival analysis, namely the Kaplan-Meier estimation, the Nelson-Aalen estimation, and the Cox proportional hazards regression model. Furthermore, the study applied the theory of counting processes and martingales to model the hazard function conditional on covariates using the relative risk model and the Aalen additive risk model. This study used data collected at the Kigali University Teaching Hospital on 933 diabetic patients admitted to or visiting the hospital during the period from the 1st January 2008 to the 31st December 2013. The results revealed that the hazard of death from diabetes, for these data, is higher in male patients as compared to female patients; it is higher in older patients compared to relatively younger ones; it is also higher in rural compared to urban patients. Patients treated with a placebo had a better survival outcome than those on conventional diabetes medications; probably, they were much healthier than those on the other three medications. Patients with normal weight, overweight, and obesity were found to have a higher hazard of death from diabetes compared to underweight patients. Patients with type II diabetes had a higher hazard of death as compared to those with type I diabetes. Finally, patients with moderately high to high blood pressure had a higher hazard of death compared to patients with low or normal blood pressure. These results were not found in a single model, but are a summary of findings obtained in the several models used. M.Sc. (Mathematical Statistics)
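
    A minimal sketch of the non-parametric estimators named above (Kaplan-Meier for the survival function, Nelson-Aalen for the cumulative hazard), assuming the Python lifelines library and synthetic right-censored durations in place of the KUTH diabetes records.

```python
import numpy as np
from lifelines import KaplanMeierFitter, NelsonAalenFitter

# Synthetic right-censored follow-up times standing in for the diabetes data.
rng = np.random.default_rng(1)
event_time = rng.exponential(scale=24.0, size=400)   # months until death
censor_time = rng.uniform(0.0, 60.0, size=400)       # administrative censoring
observed = event_time <= censor_time                 # event actually seen
durations = np.minimum(event_time, censor_time)

kmf = KaplanMeierFitter()
kmf.fit(durations, event_observed=observed, label="all patients")
print(kmf.survival_function_.tail())                 # S(t), Kaplan-Meier

naf = NelsonAalenFitter()
naf.fit(durations, event_observed=observed)
print(naf.cumulative_hazard_.tail())                 # H(t), Nelson-Aalen
```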

    Infant mortality at the Kigali University Teaching Hospital: Application of Aalen additive hazards model and comparison with other classical survival models.

    No full text
    Background: Beyond the efforts made on population policy in Rwanda so far, extensive studies on factors that could prevent infant mortality (IM) should be done to better control the infant mortality rate (IMR). This study presents an application of survival analysis to infant mortality at the Kigali University Teaching Hospital (KUTH) in Rwanda. Data and methods: A dataset of newborns at the KUTH was recorded. The Aalen Additive Hazards Model (AAHM) is used for assessing the relationship between IM and covariates. The Cox Proportional Hazards Model (CPHM) and the Cox-Aalen Hazards Model (CAHM) are also applied, and the results of these three models are compared. Findings: The AAHM distinguishes time-dependent and fixed covariates, and this allows an easy interpretation of the results found with the CPHM and the CAHM. Conclusion: Avoidance of pregnancy until after age 20 and clinically recommended nutrition for the mother during pregnancy would decrease IM.
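
    A minimal sketch of fitting the AAHM and the CPHM side by side, assuming the Python lifelines library and synthetic data; the covariates (`young_mother`, `birth_weight`) and all rates are hypothetical stand-ins for the KUTH newborn records.

```python
import numpy as np
import pandas as pd
from lifelines import AalenAdditiveFitter, CoxPHFitter

# Synthetic right-censored survival data with two hypothetical covariates.
rng = np.random.default_rng(7)
n = 500
young_mother = rng.integers(0, 2, n)             # 1 = mother under 20 (hypothetical)
birth_weight = rng.normal(3.1, 0.5, n)           # kg (hypothetical)
risk = np.exp(0.5 * young_mother - 0.4 * (birth_weight - 3.1))
event_time = rng.exponential(scale=90.0, size=n) / risk
censor_time = rng.uniform(0.0, 120.0, size=n)
df = pd.DataFrame({
    "time": np.minimum(event_time, censor_time),
    "event": (event_time <= censor_time).astype(int),
    "young_mother": young_mother,
    "birth_weight": birth_weight,
})

# Aalen additive hazards: additive, possibly time-varying covariate effects.
aaf = AalenAdditiveFitter(coef_penalizer=0.5)
aaf.fit(df, duration_col="time", event_col="event")
print(aaf.cumulative_hazards_.tail())

# Cox proportional hazards for comparison: multiplicative, time-constant effects.
cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event")
cph.print_summary()
```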

    Cryptocurrencies and Tokens Lifetime Analysis from 2009 to 2021

    No full text
    The success of Bitcoin has spurred the emergence of countless alternative coins, some of which shut down only a few weeks after their inception, thus disappearing with millions of dollars collected from enthusiastic investors through the initial coin offering (ICO) process. This has led investors, from the general population to institutional ones, to become skeptical about venturing into the cryptocurrency market, adding to its highly volatile character. It is therefore of vital interest to investigate the lifespan of available coins and tokens and to evaluate their level of survivability. This would make investors more knowledgeable and hence build their confidence in taking risks in the cryptocurrency market. A survival analysis approach is well suited to provide the needed information. In this study, we discuss the survival outcomes of coins and tokens from the first release of a cryptocurrency in 2009. Non-parametric methods of time-to-event analysis, namely the Aalen Additive Hazards Model (AAHM) through counting and martingale processes and the Cox Proportional Hazards Model (CPHM), are based on six covariates of interest. The proportional hazards assumption (PHA) is checked by assessing the Kaplan-Meier estimates of survival functions at the levels of each covariate. The results in the different regression models display significant and non-significant covariates, relative risks and standard errors. Among the results, it was found that cryptocurrencies under a standalone blockchain were at a relatively higher risk of collapsing. It was also found that the 2013–2017 cryptocurrency releases were at higher risk compared to the 2009–2013 releases, and that cryptocurrencies whose headquarters are known had relatively better survival outcomes. This provides clear indicators to watch out for while selecting the coins or tokens in which to invest.
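
    A minimal sketch of the Kaplan-Meier check of the proportional hazards assumption at the levels of one covariate, assuming the Python lifelines library and synthetic coin lifetimes; the `standalone` indicator and its effect size are hypothetical, not the study's covariates or estimates.

```python
import numpy as np
import pandas as pd
from lifelines import KaplanMeierFitter

# Synthetic coin/token lifetimes (days), split by a hypothetical binary covariate.
rng = np.random.default_rng(3)
n = 600
standalone = rng.integers(0, 2, n)                               # 1 = own blockchain
lifetime = rng.exponential(scale=np.where(standalone == 1, 400.0, 900.0))
censor = rng.uniform(0.0, 1500.0, n)                             # still alive at study end
coins = pd.DataFrame({
    "time": np.minimum(lifetime, censor),
    "dead": (lifetime <= censor).astype(int),
    "standalone": standalone,
})

# One Kaplan-Meier curve per covariate level; curves that do not cross and keep
# a roughly constant separation on the log(-log) scale support the PHA.
for level, group in coins.groupby("standalone"):
    kmf = KaplanMeierFitter()
    kmf.fit(group["time"], event_observed=group["dead"], label=f"standalone={level}")
    print(f"standalone={level}: median lifetime {kmf.median_survival_time_:.0f} days")
```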
