
    Calculations of Radiobiological Treatment Outcome in Rhabdomyosarcoma

    Thulani Nyathi, Student no: 0413256X, MSc thesis, Physics, Faculty of Science. 2006. Supervisor: Prof D van der Merwe. This study aims to calculate tumour control probabilities (TCP) and normal tissue complication probabilities (NTCP) using radiobiological models, and to correlate these probabilities with clinically observed treatment outcomes from follow-up records. The radiobiological calculations were applied retrospectively to thirty-nine paediatric patients with histologically proven rhabdomyosarcoma who were treated with radiation at Johannesburg Hospital between January 1990 and December 2000. Computer software, BIOPLAN, was used to calculate the TCP and NTCP arising from the dose distribution calculated by the treatment planning system and characterised by dose-volume histograms (DVHs). There was a weak correlation between the calculated TCP and the observed 5-year overall survival status. Potential prognostic factors for survival were also examined; statistical analysis was performed using the Cox proportional hazards regression model. The 5-year overall survival rate was 55%. The findings of this study provide a yardstick against which more aggressive radiotherapy fractionation regimes can be compared.
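
    The abstract does not reproduce the underlying model equations. As a rough illustration of the kind of calculation BIOPLAN performs, the sketch below evaluates a linear-quadratic Poisson TCP model over a differential dose-volume histogram; the parameter values, clonogen density and DVH bins are illustrative assumptions, not figures from the thesis.

```python
import numpy as np

# Sketch of a linear-quadratic (LQ) Poisson TCP calculation over a
# differential dose-volume histogram (DVH). All parameter values and the
# DVH itself are illustrative assumptions, not figures from the thesis.
alpha = 0.3        # Gy^-1, LQ linear coefficient (assumed)
beta = 0.03        # Gy^-2, LQ quadratic coefficient (assumed)
rho = 1e6          # clonogen density per cm^3 (assumed)
n_fractions = 25   # fractionation scheme (assumed)

# Hypothetical differential DVH: volume (cm^3) receiving each total dose (Gy)
doses = np.array([48.0, 50.0, 52.0])
volumes = np.array([5.0, 30.0, 5.0])

d = doses / n_fractions                                # dose per fraction per bin
surviving = np.exp(-alpha * doses - beta * d * doses)  # LQ cell surviving fraction
# Poisson TCP: probability that no clonogen survives in any DVH bin
tcp = np.prod(np.exp(-rho * volumes * surviving))
print(f"TCP = {tcp:.3f}")
```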

    Promoting Effective Teaching in Multi-Grade Teachers


    Statistical analysis of bioequivalence studies

    A Research Report submitted to the Faculty of Science in partial fulfilment of the requirements for the degree of Master of Science. 26 October 2016. Healthcare has become expensive the world over, and a large share of healthcare spending goes to drugs. To reduce the cost of drugs, manufacturers introduced generic drugs, which cost less than brand name drugs. The challenge that arose was whether generic drugs are as safe, effective and efficient as brand name drugs. Bioequivalence studies evolved in response to this challenge: statistical procedures for assessing whether generic and brand name drugs are similar in treating patients across various diseases. This study was undertaken to demonstrate how bioequivalence in drugs is established. Statistical tests are used to verify that the bioavailability of a generic drug is essentially the same as that of the original drug. The United States Food and Drug Administration took the lead in developing statistical methods for certifying generic drugs as bioequivalent to brand name drugs. Pharmacokinetic parameters are obtained from blood samples after dosing study subjects with the generic and brand name drugs. The design for analysis in this research report is a 2×2 crossover design. Average, population and individual bioequivalence are assessed from the pharmacokinetic parameters to ascertain whether the drugs are bioequivalent. Statistical procedures used include confidence intervals and interval hypothesis tests, using parametric as well as nonparametric methods. When presenting results, effect sizes are reported in addition to the hypothesis tests and confidence intervals, which indicate only whether a difference exists; if there is a difference between the generic and brand name drugs, effect sizes quantify its magnitude. KEY WORDS: bioequivalence, bioavailability, generic (test) drugs, brand name (reference) drugs, average bioequivalence, population bioequivalence, individual bioequivalence, pharmacokinetic parameters, therapeutic window, pharmaceutical equivalence, confidence intervals, hypothesis tests, effect sizes.
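
    As an illustration of the average-bioequivalence procedure described above, the sketch below applies the two one-sided tests (TOST) idea via a 90% confidence interval for the geometric mean ratio of log-transformed AUC values. The data are invented, and the paired analysis is a simplification of a full 2×2 crossover model, which would also account for period and sequence effects.

```python
import numpy as np
from scipy import stats

# Hedged sketch: average bioequivalence via the TOST procedure on
# log-transformed AUC, simplified to paired per-subject data. The AUC
# values below are made up for illustration.
auc_test = np.array([98.1, 105.4, 88.7, 112.3, 95.0, 101.8, 90.5, 107.2])  # generic
auc_ref  = np.array([100.2, 110.1, 85.3, 118.0, 97.4, 99.5, 94.8, 103.6]) # brand name

diff = np.log(auc_test) - np.log(auc_ref)   # within-subject log differences
n = len(diff)
se = diff.std(ddof=1) / np.sqrt(n)
t_crit = stats.t.ppf(0.95, df=n - 1)        # 90% CI <=> two one-sided 5% tests

lo, hi = diff.mean() - t_crit * se, diff.mean() + t_crit * se
gmr_ci = np.exp([lo, hi])                   # CI for the geometric mean ratio
print(f"90% CI for GMR: {gmr_ci[0]:.3f} - {gmr_ci[1]:.3f}")
# Conclude average bioequivalence if the CI lies within the 0.80-1.25 limits.
print("Bioequivalent:", 0.80 <= gmr_ci[0] and gmr_ci[1] <= 1.25)
```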

    A cost benefit analysis of operational risk quantification methods for regulatory capital

    Operational risk has attracted considerable attention in recent years as a result of massive operational losses that made headlines in financial markets across the world. These losses have been driven by litigation cases and regulatory fines, some of which originated from the 2008 global financial crisis. As a result, it is compulsory for financial institutions to reserve capital for the operational risk exposures inherent in their business activities. Local financial institutions are free to use any of the following operational risk capital estimation methods: the Advanced Measurement Approach (AMA), the Standardised Approach (TSA) and/or the Basic Indicator Approach (BIA). The BIA and TSA are predetermined by the Reserve Bank, whilst the AMA relies on internally generated methodologies. The estimation approaches employed in this study were originally introduced by the BCBS and are premised on increasing sophistication: banks are incentivised to continually advance their management and measurement methods by benefiting from a lower capital charge as they graduate from the least to the most sophisticated measurement tool. However, in contrast to the BCBS's premise, Sundmacher (2007), using a hypothetical example, finds that depending on a financial institution's distribution of its Gross Income, the incentive to move from the BIA to the TSA is nonexistent or marginal at best. In this thesis I extend Sundmacher's (2007) work and test one instance of AMA regulatory capital (RegCap) against that of the TSA, in a bid to crystallise the rand benefit that financial institutions stand to attain (if any) should they move from the TSA to the AMA. A Loss Distribution Approach (LDA), coupled with Monte Carlo simulation, was used in modelling the AMA. In modelling the loss severities, the Lognormal, Weibull, Burr, Generalised Pareto, Pareto and Gamma distributions were considered, whilst the Poisson distribution was used for modelling operational loss frequency. The Kolmogorov-Smirnov test and the Akaike information criterion were used, respectively, for assessing the goodness of fit and for model selection. The robustness and stability of the model were gauged using stress testing and bootstrapping. The TSA modelling design used the predetermined beta values for the different business lines specified by the BCBS. The findings show that the Lognormal and Burr distributions best describe the empirical data. Additionally, there is a substantial incentive, in terms of the rand benefit, to migrate from the TSA to the AMA in estimating operational risk capital. The initial benefit could be directed towards the changes in information technology systems needed to effect the move from the TSA to the AMA. Notwithstanding that the data set used in this thesis is restricted to just one of the "big four banks" (owing to proprietary restrictions), the methodology is generalisable to the other big banks within South Africa. The scope of this study could be extended to cover Extreme Value Theory, non-parametric empirical sampling, Markov Chain Monte Carlo, and Bayesian approaches to estimating operational risk capital.
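
    A minimal sketch of the LDA-plus-Monte-Carlo step described above, assuming a Poisson frequency and a lognormal severity with made-up parameters (the thesis fits several candidate severity distributions and selects among them via KS and AIC):

```python
import numpy as np

rng = np.random.default_rng(42)

# Hedged sketch of a Loss Distribution Approach (LDA) via Monte Carlo:
# a Poisson annual loss frequency is convolved with a lognormal severity,
# and the 99.9th percentile of the simulated aggregate annual loss is taken
# as the capital charge. All parameters are illustrative assumptions, not
# the fitted values from the thesis.
lam = 25.0             # mean annual number of losses (Poisson, assumed)
mu, sigma = 11.0, 1.8  # lognormal severity parameters on the log scale (assumed)
n_years = 100_000      # number of simulated years

# Aggregate annual loss: draw a loss count, then sum that many severities.
annual_loss = np.array([
    rng.lognormal(mu, sigma, size=n).sum()
    for n in rng.poisson(lam, size=n_years)
])

capital = np.quantile(annual_loss, 0.999)  # 99.9% quantile as the capital charge
print(f"Simulated AMA capital charge: R{capital:,.0f}")
```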

    Analysis of operational risk in the South African banking sector using the standardised measurement approach

    Abstract: Over the last decade, financial markets across the world have been devastated by operational risk-related incidents. These incidents were caused by a number of factors, such as, inter alia, fraud, improper business practices, natural disasters, and technology failures. As new losses are incurred, they become part of each financial institution's internal loss database, and their inclusion has caused notable upward spikes in the operational risk Pillar I regulatory capital charge for financial institutions across the board. The inherent imperfections in people, processes, and systems (whether by intention or oversight) are exposures that cannot be entirely eliminated from bank operations. Thus, the South African Reserve Bank mandates South African financial institutions to reserve capital to cover their idiosyncratic operational risk exposures. Investors fund the capital reserves held by financial institutions, and these stakeholders demand a viable return on their investment. Consequently, the relationship between risk exposure and capital held should be fully understood, managed, and optimised. This thesis extends Sundmacher's (2007) work by testing one instance of Standardised Measurement Approach data against the Advanced Measurement Approach, the Standardised Approach, and the Basic Indicator Approach, to estimate the potential financial benefit that financial institutions in South Africa could attain or lose should they move from a Basic Indicator Approach to a Standardised Approach, from a Standardised Approach to an Advanced Measurement Approach, or from an Advanced Measurement Approach to a Standardised Measurement Approach. For the Advanced Measurement Approach, a Loss Distribution Approach coupled with Monte Carlo simulation was used. Parametric models were imposed to generate the annual loss distribution through the convolution of the annual loss frequency and severity distributions. To fit the internal loss data for each class, the mean annual number of losses was calculated and assumed to follow a Poisson distribution. The Maximum Likelihood Estimator was used to fit four severity distributions: Lognormal, Weibull, Generalised Pareto, and Burr. To determine the goodness of fit, the Kolmogorov-Smirnov test at a 5% level of significance was used; to select the best-fitting distribution, the Akaike Information Criterion was used. Robustness and stability tests were then performed, using bootstrapping and stress-testing respectively. Overall, we find that the Basel Committee on Banking Supervision's premise that there is value in a financial institution moving from the Basic Indicator Approach to the Standardised Approach, or from the Standardised Approach to the Advanced Measurement Approach, is indeed valid, but that it fails for the movement from an Advanced Measurement Approach to a Standardised Measurement Approach. The best Pillar I capital reprieve is offered by the Diversified Advanced Measurement Approach; the second best is the Standardised Measurement Approach based on an average total loss threshold of €100k (0.87% higher than the Diversified Advanced Measurement Approach), closely followed by the default Standardised Measurement Approach based on an average total loss threshold of €20k (5.63% higher than the Diversified Advanced Measurement Approach).
    To the best of our knowledge, no previous work is comprehensive enough to include all four available operational risk quantification approaches (Basic Indicator Approach, Standardised Approach, Advanced Measurement Approach, and Standardised Measurement Approach) for the South African market in particular. This work foresees South African financial institutions pushing back on the implementation of the SMA, and potentially lobbying the regulator to remain on the AMA, as the alternative might mean increased capital requirements and hence reduced Economic Value Added to shareholders (as more capital is required at the same level of profitability or business activity). Financial institutions are anticipated to cite advanced modelling techniques as helping management gain a deeper understanding of their exposures, whilst the Scenario Analysis process gives them a method of identifying and quantifying their key risks (adding to management's toolset). However, if South African financial institutions want to compete on the global stage and to be accepted among 'internationally active' institutions, their adoption of the SMA may not be a choice but an obligation: an entry ticket to global trade. M.Com. (Financial Economics)
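
    For reference, a sketch of the Standardised Measurement Approach calculation as published by the Basel Committee (Business Indicator Component, Loss Component, and Internal Loss Multiplier). The example inputs are assumptions, not figures from the thesis, and the simplification ignores the bucket-1 rule that fixes the ILM at 1.

```python
import math

def sma_capital(bi: float, avg_annual_loss: float) -> float:
    """Hedged sketch of the BCBS Standardised Measurement Approach.
    Amounts are in EUR billions; the bucket thresholds and marginal
    coefficients follow the published Basel III standard, but this is an
    illustrative simplification, not the thesis's implementation."""
    # Business Indicator Component: marginal coefficients per BI bucket
    buckets = [(1.0, 0.12), (30.0, 0.15), (float("inf"), 0.18)]
    bic, lower = 0.0, 0.0
    for upper, coeff in buckets:
        bic += coeff * max(0.0, min(bi, upper) - lower)
        lower = upper

    # Loss Component: 15x the average annual operational loss
    lc = 15.0 * avg_annual_loss
    # Internal Loss Multiplier scales capital by the bank's loss experience
    ilm = math.log(math.e - 1.0 + (lc / bic) ** 0.8)
    return bic * ilm

# Example: BI of EUR 8bn and average annual losses of EUR 0.1bn (assumed)
print(f"SMA capital: EUR {sma_capital(8.0, 0.1):.3f}bn")
```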

    Single aviation markets and contestability theory: getting the policy bearings right

    This paper examines the theory of contestable markets as it relates to the aviation industry, particularly its contribution to the deregulation debate and its subsequent extension to the evaluation of welfare benefits in single aviation markets. The conclusion reached is that the theory has limited usefulness for policy formulation in aviation markets.

    Factors predisposing never-married women to have children in Namibia

    A research report submitted to the Faculty of Humanities, School of Social Sciences, University of the Witwatersrand, Johannesburg, in partial fulfilment of the requirements for the degree of Master of Arts in the field of Demography and Population Studies. 17 September 2015. Context: Marriage has generally been an early and almost universal phenomenon in Sub-Saharan Africa, and can be seen as an important determinant of fertility. However, fertility among never-married women is no longer negligible. Non-marital childbearing has increased, as women spend much of their reproductive lives unmarried while remaining sexually active. Although a number of studies have examined non-marital childbearing, they have focused largely on teenagers and adolescents. The purpose of this study is to identify the factors predisposing never-married women aged 25-49 to have children. Methodology: This study was a secondary analysis of the 2006-07 Namibia Demographic and Health Survey data. The study population was never-married women aged 25-49, with a total weighted sample of 2,121. The dependent variable was never-married fertility, categorised into women who had had no birth and those who had had at least one birth. Age-specific fertility rates were calculated using the TFR2 module. Bivariate and multivariate binomial logistic regression techniques were used to examine the association between the independent variables of interest and never-married women's childbearing experience. Results: The study showed that 79% of never-married women aged 25-49 had at least one child. Respondents from poor households, less educated respondents, rural dwellers and women from the Herero ethno-linguistic group were more likely to be never-married mothers. The odds of being a never-married mother increased with age, and were higher among women who reported ever having used contraception. The results further showed that delaying age at sexual debut decreases the probability of being a never-married mother. Conclusion: Childbearing among never-married women is common in Namibia, and the risk of having children outside of marriage increases with age. The consequences of never-married women's childbearing should be studied, with a focus on the factors identified as influencing their childbearing. Furthermore, policies and programmes addressing never-married women's fertility should reflect these factors, in a context where marriage levels are decreasing and fertility is happening outside of marriage.
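
    A minimal sketch of the multivariate binomial logistic regression step described in the methodology, using simulated stand-in data (the variable names, effect sizes, and data are hypothetical, not NDHS results):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500

# Hedged sketch: modelling the odds that a never-married woman aged 25-49
# has had at least one birth. All variables are simulated stand-ins for
# the 2006-07 Namibia DHS covariates used in the study.
age = rng.integers(25, 50, n)
rural = rng.integers(0, 2, n)           # 1 = rural dweller
years_school = rng.integers(0, 15, n)   # years of schooling

# Simulate the outcome with assumed effects: older and rural women more
# likely, and more-educated women less likely, to be never-married mothers.
logit = -2.0 + 0.08 * age + 0.6 * rural - 0.15 * years_school
has_child = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = sm.add_constant(np.column_stack([age, rural, years_school]))
fit = sm.Logit(has_child, X).fit(disp=False)
print(np.exp(fit.params))  # odds ratios: const, age, rural, years_school
```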