
    Availability, Affordability, and Pricing of Anti-cancer Medicines in Selected Low and Middle-Income Countries in Africa

    Introduction: Cancer is a leading cause of morbidity and mortality in low- and middle-income countries (LMICs). Health outcomes may improve with early detection and treatment. In several African countries, including Ghana and South Africa, the absence of a clear medicine pricing policy means that cancer medicine prices vary widely with foreign-exchange fluctuations and import tariffs, which impairs access. Aim: This research aimed to assess the availability, affordability, prices, and price components of cancer medicines in South Africa and Ghana. Method: A systematic literature review was undertaken on the availability, pricing, affordability, and access to cancer medicines in LMICs. An adapted World Health Organization (WHO) and Health Action International (HAI) methodology was used to determine the availability, prices, and affordability of cancer medicines in South Africa and Ghana, including a case study to assess the price components in the Ghanaian distribution chain. Affordability was also assessed in terms of the impoverishment of the South African population after procuring cancer medicines. Results: The literature review showed wide differences across LMICs in the availability and prices of cancer medicine brands, with low-income earners abandoning treatment because of unaffordability. This research found similar patterns: availability of cancer medicines well below the WHO target of 80%; substantial price differences between brands, driven by high markups on both generic and originator medicines in all sectors, with originator brands marked up more than generics; higher prices in private facilities than in public facilities; and unaffordability for low-income earners, some of whom were impoverished after buying cancer medicines.
Conclusion: This research contributes to academic knowledge, and its findings can support quality pricing data, comprehensive policies, regulations, and innovative interventions by governments and stakeholders to improve affordable access to cancer medicines
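The WHO/HAI methodology mentioned above summarizes prices as a median price ratio against an international reference price and expresses affordability as the number of days' wages of the lowest-paid unskilled government worker needed to buy a course of treatment. A minimal sketch of both measures, using hypothetical figures rather than the study's data:

```python
from statistics import median

def median_price_ratio(local_unit_prices, reference_unit_price):
    """WHO/HAI MPR: median of local unit prices / international reference price."""
    return median(local_unit_prices) / reference_unit_price

def affordability_days_wages(course_cost, daily_wage):
    """Days the lowest-paid government worker must work to buy one course."""
    return course_cost / daily_wage

# Illustrative figures only (not from the study):
mpr = median_price_ratio([4.0, 5.0, 6.0], reference_unit_price=2.0)
days = affordability_days_wages(course_cost=150.0, daily_wage=10.0)
```

Under the methodology, an MPR well above 1 flags high local prices, and a course costing more than about a day's wage is commonly treated as unaffordable.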

    The differences of individuals with Type A and Type B behavior patterns and the Women's Awareness Seminar on self-actualization and flexibility

    The relationship between psychological traits and the incidence of some diseases has captured the attention of researchers in medicine as well as psychology, education, and society. Rosenman and Friedman were pioneers in the discovery that a certain pattern of behaviors may be associated with the risk of heart disease. They designated these psychological factors as the Type A behavior pattern and the absence of these factors as the Type B behavior pattern. Several behavioral techniques have been used to modify the Type A behavior pattern. It has been suggested that Type A individuals may need different types of intervention due to educational, socio-economic, or other differences. The techniques of humanistic psychology provide yet another method for changing the Type A behavior pattern. This study explored the effectiveness of using a model designed to increase the self-actualization and flexibility of subjects in a small-group setting. The subjects were classified as Type A, Type B, or indeterminate according to their scores on the Jenkins Activity Survey. Subjects for the experiment (N = 30) were female community college students enrolled in the Women's Awareness Seminar; the control group (N = 37) comprised female community college students enrolled in an orientation class. All experimental subjects received a 30-hour, ten-week course designed to increase self-awareness and self-actualization. The model included reading assignments, written homework, lectures, relaxation techniques, and experiential participation in class. Testing for the effects of the intervention model consisted of pre- and posttest administration of the Personal Orientation Inventory (POI; Tc and I scales) and the California Psychological Inventory (CPI; Fx scale). The Jenkins Activity Survey (JAS) was administered only as a pretest, for classification purposes.
The Tc, I, and Fx scales were used as change measures for the effectiveness of the treatment; age and pretest scores served as covariates. Predicted outcomes and results included: (1) Before treatment, individuals in the experimental and control groups with the Type B behavior pattern will be significantly greater self-actualizers and more flexible than Type A individuals on pretest measures of the POI and CPI. (Rejected.) (2) After intervention, individuals in the experimental group will show a greater increase in self-actualization and flexibility than individuals in the control group, by comparison of pre- and posttest measures of the POI and CPI. (Rejected.) (3) After intervention, Type A's and Type B's in the experimental group will show a significant increase in self-actualization and flexibility as compared with Type A's and B's in the control group, measured by comparing pre- and posttest scores on the POI and CPI. (Rejected.) The three hypotheses were tested by one-way analysis of variance (Tc, I, Fx, and JAS scores as dependent variables). Age and pretest scores served as covariates for hypotheses two and three. Hypothesis two was further tested by Student's t-test to measure group change. All hypotheses were tested at the .05 level. Results indicated that the intervention was not effective in changing the self-actualization or flexibility of the experimental subjects. No significant difference was found between Type A's and B's on pretest measures of self-actualization or flexibility. Age did not appear to contribute to the variance
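The hypothesis tests above rest on the one-way analysis of variance. As a reminder of what that statistic computes, here is a self-contained sketch of the F ratio (between-group mean square over within-group mean square) using illustrative scores, not the study's data:

```python
from statistics import mean

def one_way_anova_F(groups):
    """F = between-group mean square / within-group mean square."""
    k = len(groups)                          # number of groups
    n = sum(len(g) for g in groups)          # total observations
    grand = mean(x for g in groups for x in g)
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    ss_within = sum((x - mean(g)) ** 2 for g in groups for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Two hypothetical groups of scale scores:
F = one_way_anova_F([[1.0, 2.0, 3.0], [4.0, 5.0, 6.0]])
```

A large F relative to the F distribution's critical value at the .05 level would reject the null hypothesis of equal group means; adjusting for covariates such as age extends this to analysis of covariance.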

    Investigating the mechanisms of nitrite-mediated cardioprotection

    Nitrite is able to elicit cardioprotection against myocardial ischaemia-reperfusion injury (IRI) when administered as a preconditioning agent. Activation of the classical NO-sGC-cGMP-PKG pathway under hypoxic and/or acidic conditions has been implicated in this protection, but the exact mechanism remains unclear. Herein, we investigated whether nitrite mediates cardioprotection by (1) PKG1α oxidation and/or (2) the ALDH2 pathway. Using an isolated Langendorff mouse model of IRI, nitrite (100 ÎŒM) was administered as a preconditioning agent in (1) PKG1α WT and Cys42Ser KI mice or (2) ALDH2 WT and KO mice. We demonstrate that nitrite improves cardiac function (LVDP; p<0.05; n=8-12), thus suggesting a dual mechanism involving the classical pathway in the cardiomyocytes. In contrast, we offer evidence to support ALDH2 involvement in nitrite-mediated improvements in coronary flow reserve (CFR; p<0.001; n=7-11) and suggest it may also be involved in cardioprotection. The study provides novel evidence supporting the involvement of both the PKG1α oxidation and ALDH2 pathways in nitrite-mediated effects at the level of the cardiac microvasculature and cardiomyocytes in a murine model of myocardial IRI

    Graphs behind data: A network-based approach to model different scenarios

    Nowadays, the amount and variety of scenarios that can benefit from techniques for extracting and managing knowledge from raw data have dramatically increased. As a result, the search for models capable of ensuring the representation and management of highly heterogeneous data is a hot topic in the data science literature. In this thesis, we aim to propose a solution to address this issue. In particular, we believe that graphs, and more specifically complex networks, as well as the concepts and approaches associated with them, can represent a solution to the problem mentioned above. In fact, we believe that they can be a unique and unifying model to uniformly represent and handle extremely heterogeneous data. Based on this premise, we show how the same concepts and approaches have the potential to address different open issues in different contexts
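The idea of a network as a unifying container for heterogeneous data can be illustrated with a minimal property-graph structure, where nodes and edges of different types coexist in one model. This is a generic sketch, not the thesis's actual implementation; all names are illustrative:

```python
class PropertyGraph:
    """A minimal heterogeneous graph: typed nodes, typed edges, attributes."""
    def __init__(self):
        self.nodes = {}   # node id -> {"type": ..., plus arbitrary attributes}
        self.edges = []   # list of (source id, target id, edge type)

    def add_node(self, node_id, node_type, **attrs):
        self.nodes[node_id] = {"type": node_type, **attrs}

    def add_edge(self, src, dst, edge_type):
        self.edges.append((src, dst, edge_type))

    def neighbors(self, node_id, edge_type=None):
        """Targets reachable from node_id, optionally filtered by edge type."""
        return [d for s, d, t in self.edges
                if s == node_id and (edge_type is None or t == edge_type)]

# Entities of different kinds live uniformly in one graph:
g = PropertyGraph()
g.add_node("u1", "user", name="Ada")
g.add_node("p1", "paper", title="Graphs behind data")
g.add_edge("u1", "p1", "authored")
```

Because node and edge types are just data, the same traversal and network-analysis machinery applies regardless of what the nodes represent, which is the unifying property the thesis argues for.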

    Application of cardiac imaging modalities in the assessment of coronary artery disease

    In this thesis, cardiac MRI and intravascular imaging have been used to address challenges in coronary artery disease. A novel high-resolution three-dimensional magnetic resonance imaging method for quantifying infarct size and area at risk in a rodent model was developed and validated. Two contrast agents, microparticles of iron oxide and gadolinium, were used to demarcate the area at risk and myocardial infarction. In order to understand the clinical indications for use of Optical Coherence Tomography (OCT), a single-centre experience where OCT was used to help guide management in routine percutaneous coronary intervention (PCI) is reported. There are limited data on how well two-dimensional Quantitative Coronary Angiography (2D-QCA) and OCT agree with each other for measurement of coronary artery lumen dimensions. In this comparison study, there was good correlation and agreement between 2D-QCA and OCT for measurement of proximal and distal reference diameters of a coronary vessel. However, the minimum luminal diameter was underestimated by 2D-QCA. Higher in-hospital and 30-day mortality has been observed in patients who present with a myocardial infarction and have no standard modifiable cardiovascular risk factors (SMuRFs) as compared to patients with one or more SMuRFs. In this intravascular ultrasound study, the SMuRFless patients had a lower plaque burden and calcification at baseline, but the rate of progression of plaque was similar to patients with SMuRFs. In this cardiac MRI study, there was no difference in 30-day mortality; however, SMuRFless patients had a larger infarct size and a smaller myocardial salvage index. This association was mediated by a larger proportion of left anterior descending artery (LAD) culprit lesions and poor TIMI flow pre-PCI. Interestingly, the proportion of patients with culprit LAD and TIMI 0-1 flow pre-PCI was significantly higher in SMuRFless patients
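Agreement between two measurement methods, as in the 2D-QCA versus OCT comparison above, is commonly summarized with Bland-Altman bias and limits of agreement (the abstract does not state which statistic was used, so this is a general sketch with hypothetical lumen diameters, not study data):

```python
from statistics import mean, stdev

def bland_altman(method_a, method_b):
    """Bias and 95% limits of agreement between paired measurements."""
    diffs = [a - b for a, b in zip(method_a, method_b)]
    bias = mean(diffs)
    sd = stdev(diffs)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical minimum luminal diameters (mm) from the two modalities:
qca = [2.9, 3.1, 2.5, 3.4]
oct_ = [3.0, 3.2, 2.7, 3.5]
bias, (lo, hi) = bland_altman(qca, oct_)
```

A consistently negative bias here would correspond to the kind of systematic underestimation by 2D-QCA that the study reports for minimum luminal diameter.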

    Tracing the Compositional Process. Sound art that rewrites its own past: formation, praxis and a computer framework

    The domain of this thesis is electroacoustic computer-based music and sound art. It investigates a facet of composition which is often neglected or ill-defined: the process of composing itself and its embedding in time. Previous research mostly focused on instrumental composition or, when electronic music was included, the computer was treated as a tool which would eventually be subtracted from the equation. The aim was either to explain a resultant piece of music by reconstructing the intention of the composer, or to explain human creativity by building a model of the mind. Our aim instead is to understand composition as an irreducible unfolding of material traces which takes place in its own temporality. This understanding is formalised as a software framework that traces creation time as a version graph of transactions. The instantiation and manipulation of any musical structure implemented within this framework is thereby automatically stored in a database. Not only can it be queried ex post by an external researcher—providing a new quality for the empirical analysis of the activity of composing—but it is an integral part of the composition environment. Therefore it can recursively become a source for the ongoing composition and introduce new ways of aesthetic expression. The framework aims to unify creation and performance time, fixed and generative composition, human and algorithmic “writing”, a writing that includes indeterminate elements which condense as concurrent vertices in the version graph. The second major contribution is a critical epistemological discourse on the question of observability and the function of observation. Our goal is to explore a new direction of artistic research which is characterised by a mixed methodology of theoretical writing, technological development and artistic practice. 
The form of the thesis is an exercise in becoming process-like itself, wherein the epistemic thing is generated by translating the gaps between these three levels. This is my idea of the new aesthetics: That through the operation of a re-entry one may establish a sort of process “form”, yielding works which go beyond a categorical either “sound-in-itself” or “conceptualism”. Exemplary processes are revealed by deconstructing a series of existing pieces, as well as through the successful application of the new framework in the creation of new pieces
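The core data structure described above — creation time traced as a version graph of transactions, with indeterminate elements condensing as concurrent vertices — can be sketched as a directed acyclic graph of commits. This is a generic illustration under assumed names, not the thesis's actual framework:

```python
import itertools

class VersionGraph:
    """A DAG of transactions: each version records its parent versions,
    so concurrent edits become concurrent vertices that can later merge."""
    def __init__(self):
        self._ids = itertools.count()
        root = next(self._ids)
        self.versions = {root: ((), None)}   # id -> (parent ids, payload)
        self.head = root

    def commit(self, payload, parents=None):
        """Record a transaction; defaults to extending the current head."""
        vid = next(self._ids)
        self.versions[vid] = (tuple(parents or (self.head,)), payload)
        self.head = vid
        return vid

    def ancestry(self, vid):
        """All transactions reachable from vid: the version's full history."""
        seen, stack = set(), [vid]
        while stack:
            v = stack.pop()
            if v not in seen:
                seen.add(v)
                stack.extend(self.versions[v][0])
        return seen

g = VersionGraph()
a = g.commit("add sound object")
b = g.commit("edit envelope")
fork = g.commit("alternative take", parents=(a,))   # concurrent vertex
merge = g.commit("merge takes", parents=(b, fork))  # branches reconverge
```

Because every manipulation is a stored transaction, the graph can be queried after the fact for empirical analysis, or read back by the running composition itself — the recursive "re-entry" the thesis describes.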

    4th Annual Computer & Technology Law Institute

    Materials from the 4th Annual Computer & Technology Law Institute held by UK/CLE in November 2002

    Application of object-orientation to HDL-based designs

    The increase in the scale of VLSI circuits over the last two decades has been of great importance to the development process. To cope with this ever-growing design complexity, new development techniques and methodologies have been researched and applied. The early 90s witnessed the uptake of a new kind of design methodology based on Hardware Description Languages (HDLs). This methodology has helped to master the possibilities inherent in our ability to manufacture ever-larger designs. However, while HDL-based design methodology is sufficient to address today's standard ASIC sizes, it reaches its limits when considering tomorrow's design scales. Already, RISC processor chip descriptions can contain tens of thousands of HDL lines. Object-oriented design methodology has recently had a considerable impact in the software design community, as it is tightly coupled with the handling of complex systems. Object-orientation concentrates on data rather than functions since, throughout the design process, data are more stable than functions. Methodologies for both hardware and software have been introduced through the application of HDLs to hardware design. Common design constructs and principles that have proved successful in software language development should therefore be considered in order to assess their suitability for HDL-based designs. A new methodology was created to emphasise encapsulation, abstraction, and classification of designs, using standard VHDL constructs. This achieves higher levels of modelling along with improved reusability through design inheritance. The development of extended semantics for integrating object-orientation in the VHDL language is described. Comparisons are made between the modelling abilities of the proposed extension and other competing proposals. A UNIX-based object-oriented-to-standard-VHDL pre-processor is described, along with translation techniques and their issues related to synthesis and simulation. 
This tool permitted validation of the new design methodology by application to existing design problems
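The pre-processor described above is a source-to-source translator: object-oriented constructs are expanded into standard VHDL before synthesis or simulation. The thesis defines its own extended syntax; the sketch below only illustrates the general technique with a hypothetical `-- inherits:` annotation that splices in ports recorded for a named base design:

```python
import re

# Hypothetical registry of base designs and their port declarations.
# The real pre-processor's syntax and inheritance rules are defined
# in the thesis; everything here is illustrative.
BASE_PORTS = {"counter_base": ["clk : in bit;", "rst : in bit;"]}

def expand_inheritance(vhdl_text):
    """Replace each '-- inherits: <name>' line with the base design's ports."""
    lines = []
    for line in vhdl_text.splitlines():
        m = re.match(r"\s*-- inherits:\s*(\w+)", line)
        if m:
            lines.extend(BASE_PORTS[m.group(1)])   # splice inherited ports
        else:
            lines.append(line)
    return "\n".join(lines)

src = "port (\n-- inherits: counter_base\nq : out bit\n);"
out = expand_inheritance(src)
```

Emitting plain VHDL keeps the extended language compatible with unmodified synthesis and simulation tools, which is the design choice the abstract attributes to the pre-processor approach.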

    The development of a programme on coping with divorce

    M.A. (Social Science). The goals of this study were formulated in response to the need for a skills-oriented programme to assist individuals to cope with divorce. The aim of this study was to develop a coping-with-divorce programme, to implement the programme on a trial basis, and to evaluate the programme. The study was undertaken within the framework of the developmental research and utilization model. The research design integrated exploratory, descriptive, and evaluative methods. Qualitative and quantitative methods of data collection were applied. The phenomenon of divorce, with specific reference to the effects of divorce and aspects central to coping with divorce, was identified and discussed. Based on this analysis of divorce, a "Coping with Divorce" (CWD) programme was designed and developed. The CWD programme was implemented and evaluated on a trial basis. The most important finding established in this study was that certain aspects of respondents' social functioning showed a statistically significant difference between the pre- and post-test. Furthermore, respondents' self-esteem and perception of knowledge and skills regarding coping with divorce showed a statistically significant difference between the pre- and post-test. The findings of this trial investigation cannot be generalised, and it is recommended that the programme be subjected to more extensive evaluation
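Pre/post differences on matched respondents, as reported above, are typically assessed with a paired test. As a reminder of the underlying computation, here is a sketch of the paired t statistic with hypothetical scores (the abstract does not specify which test the study used):

```python
from statistics import mean, stdev
from math import sqrt

def paired_t(pre, post):
    """t = mean(diff) / (sd(diff) / sqrt(n)) for matched pre/post scores."""
    diffs = [b - a for a, b in zip(pre, post)]
    return mean(diffs) / (stdev(diffs) / sqrt(len(diffs)))

# Hypothetical self-esteem scores before and after the programme:
t = paired_t(pre=[10, 12, 11, 13], post=[14, 15, 13, 16])
```

The resulting t value is compared against the t distribution with n-1 degrees of freedom to decide whether the pre/post change is statistically significant.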

    Postcolonialism Cross-Examined

    Taking a strikingly interdisciplinary and global approach, Postcolonialism Cross-Examined reflects on the current status of postcolonial studies and attempts to break through traditional boundaries, creating a truly comparative and genuinely global phenomenon. Drawing together the field of mainstream postcolonial studies with post-Soviet postcolonial studies and studies of the late Ottoman Empire, the contributors in this volume question many of the concepts and assumptions we have become accustomed to in postcolonial studies, creating a fresh new version of the field. The volume calls the merits of the field into question, investigating how postcolonial studies may have perpetuated and normalized colonialism as an issue exclusive to Western colonial and imperial powers. The volume is the first to open a dialogue between three different areas of postcolonial scholarship that previously developed independently from one another:
    ‱ the wide field of postcolonial studies working on European colonialism,
    ‱ the growing field of post-Soviet postcolonial/post-imperial studies,
    ‱ the still fledgling field of post-Ottoman postcolonial/post-imperial studies,
    supported by sideways glances at the multidirectional conditions of interaction in East Africa and the East and West Indies. Postcolonialism Cross-Examined looks at topics such as humanism, nationalism, multiculturalism, nostalgia, and the Anthropocene in order to piece together a new, broader vision for postcolonial studies in the twenty-first century. By including territories other than those covered by the postcolonial mainstream, the book strives to reframe the “postcolonial” as a genuinely global phenomenon and develop multidirectional postcolonial perspectives