
    Science, politics, and health in the brave new world of pharmaceutical carcinogenic risk assessment: Technical progress or cycle of regulatory capture?

    Abstract: The carcinogenicity (cancer-inducing potential) of pharmaceuticals is an important risk factor for health when considering whether thousands of patients on drug trials or millions/billions of consumers in the marketplace should be exposed to a new drug. Drawing on fieldwork involving over 50 interviews and documentary research spanning 2002–2010 in Europe and the US, and on regulatory capture theory, this article investigates how the techno-regulatory standards for carcinogenicity testing of pharmaceuticals have altered since 1998. It focuses on the replacement of long-term carcinogenicity tests in rodents (especially mice) with shorter-term tests involving genetically engineered mice (GEM). Based on evidence regarding financial/organizational control, methodological design, and interpretation of the validation and application of these new GEM tests, it is argued that regulatory agencies permitted the drug industry to shape such validation and application in ways that prioritized commercial interests over the need to protect public health. Boundary-work enabling industry scientists to define some standards of public-health policy facilitated such capture. However, as the scientific credibility of GEM tests as tools to protect public health by screening out carcinogens became inescapably problematic, a regulatory resurgence, impelled by reputational concerns, exercised more control over industry’s construction and use of the tests. The extensive problems with GEM tests as public-health-protective regulatory science raise the spectre that alterations to pharmaceutical carcinogenicity-testing standards since the 1990s may have been boundary-work in which the political project of decreasing the chance that companies’ products are defined as carcinogenic has masqueraded as techno-science.

    Maternal determinants and fetal outcome of twin pregnancy: a five-year survey

    Background: To study the prevalence of twin pregnancy, maternal risk factors and fetal outcome in twin pregnancy. Methods: A retrospective study of mothers with twin pregnancies who delivered during a period of 5 years. There were 109 mothers who gave birth to 218 babies. Maternal details, antenatal complications and fetal outcomes were analysed. Results: There were 5432 deliveries, which included 109 twin births; the prevalence of twinning was 20/1000 deliveries. The mean maternal age was 28.11 years (SD ±4.89), with 69.7% in the younger age groups. No association with parity, BMI or ovulation induction was found. The most common complication was preterm delivery (64.2%), with a mean gestational age of 35.07 weeks (SD ±2.32). Others were diabetes (25.7%), hypertension (22.9%), hypothyroidism (14.6%) and postpartum hemorrhage (13.7%). Cesarean section was the commonest mode of delivery (78.0%), with fetal malpresentation (26.6%), fetal distress (20.2%) and hypertension (12.0%) being the commonest indications. Among the hypertensive mothers, 23 delivered by cesarean and only 2 delivered vaginally, a statistically significant difference (p = 0.03, OR 5.20). Dichorionicity was commoner than monochorionicity (66.1% vs. 33.9%). Of the 218 fetuses delivered, 214 were live births and 4 were stillborn. There were low birth weight babies (70.6%), normal weight babies (15.3%), VLBW babies (11.5%) and ELBW babies (2.7%). Fetal complications were IUGR (11.46%), discordant twins (6.8%), congenital anomalies (1.8%), single fetal demise (1.8%) and intrauterine death of a twin (0.4%). The perinatal mortality rate was 1.65 per thousand births. Conclusions: The prevalence of twin pregnancy was 20/1000 deliveries, and twin pregnancies were more common in the younger age group. Preterm labor, diabetes and hypertension were the main complications, with cesarean the most common mode of delivery. Dichorionicity led to fewer fetal complications and lower perinatal mortality.
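    The cesarean-versus-vaginal comparison among hypertensive mothers above is a standard 2×2 contingency analysis. The following is a minimal sketch of the odds-ratio arithmetic; the abstract reports only the hypertensive row (23 cesarean, 2 vaginal), so the non-hypertensive counts below are hypothetical and the resulting value is illustrative rather than the study's reported OR of 5.20.

```python
# Odds ratio for a 2x2 table of delivery mode vs. hypertension status.
# Row 1 (hypertensive): 23 cesarean, 2 vaginal -- from the abstract.
# Row 2 (non-hypertensive): 62 cesarean, 22 vaginal -- hypothetical counts.

def odds_ratio(a, b, c, d):
    """OR for the table [[a, b], [c, d]]: (a/b) / (c/d) = (a*d) / (b*c)."""
    return (a * d) / (b * c)

print(round(odds_ratio(23, 2, 62, 22), 2))  # 4.08 with these illustrative counts
```

    A significance test on such a table (the reported p = 0.03) would typically be Fisher's exact test or a chi-squared test, both standard for small cell counts like the 2 vaginal deliveries here.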

    When civil servants meet consultants: Convergence or conversion in the representations of change?

    Public sector reform has been on the agenda of governments in the United Kingdom for many decades; New Public Management is the most recent thinking driving changes in public administration since the 1980s. This thesis explores the social psychology of an encounter between civil servants and consultants engaged by the government in the late 1990s. This encounter between two fundamentally different groups, the institution of the British Civil Service and the community of practice of management consultants, resulted in a culture clash of ethos, languages, rites and rituals, perceptions of change, and actions. It is a crucial moment at which to capture the experience of change and the consequences of these representations in the process. The thesis tracks the social representations of change, and the acts of representing the change, over a nine-month period during which over 800 staff members from both groups worked intensively together, impacting over 10,000 employees and documenting the change. Drawing on a social psychological theory of representations, the thesis examines these processes both cross-sectionally and longitudinally. Representations are analysed using co-occurrence analysis of the language used in the documentation about the change (using the ALCESTE software). The results address the implications of this 'arranged marriage' between two different cultures brought together by a third party, in this case the government. The study presents evidence of both convergence and conversion of representations over time, and offers putative conditions for one or the other to occur. Recommendations are made for private sector change models to work towards convergence rather than conversion.

    Director-Matcher Task Project

    The Director-Matcher Task Project aims to create a new, flexible web toolkit for the collection of data for research in psychology, linguistics, and cognitive science. The toolkit was initially developed to record instances of code switching, a linguistic concept concerned with a speaker alternating between multiple languages during one conversation. The project is based on Toy Task experiments, a modification of the director-matcher task developed at the University of Bangor’s bilingual research center. Traditionally, two subjects separated by a wall were tasked with matching object placement on a set of chessboards through vocal communication. The Director-Matcher Task Project uses web technologies such as HTML5, JavaScript, and PHP to digitize this concept. The toolkit separates the board and the participants physically by providing an online interface for conducting code switching studies, allowing research to be carried out with subjects all over the world without direct physical contact. It also attempts to limit participants’ cognitive bias by removing as many references to a specific language as possible from the toy task itself, to help foster purer and more natural code switching data. The toolkit further supports analysis of session voice data and board states to create corpora for code switching experiments.

    Pelvic inflammatory disease and the risk factors

    Background: Pelvic inflammatory disease (PID) is one of the most common gynecological disorders of women. It is a clinical condition wherein the endometrium, fallopian tubes and the adjacent pelvic structures are infected by ascending infection from the lower genital tract (vagina and cervix) through the uterine cavity, leading to severe morbidity. Methods: 150 non-pregnant women who presented with clinical symptoms suggestive of pelvic inflammatory disease and were diagnosed with acute pelvic infection or PID were included in the study. Demographic details such as age, weight, height, parity, socio-economic status and education level were noted. Results: 54% of the women belonged to the 26-30 years age group, followed by 19.3% between 20-25 years. 35.3% of the patients were illiterate, followed by primary school education in 29.3%. 74% of the patients belonged to the lower class while 24.7% were from the middle class. Condoms were the most common contraceptive method, used in 32% of the cases, while 27.3% of the patients used intrauterine devices. Conclusions: Proper education must be given, especially to women from the lower socioeconomic strata, regarding the hazards of early marriage and poor hygiene, and the need to abstain from multiple partners.

    Biophysical profile and modified biophysical profile in predicting the fetal outcome

    Background: Assessment of the baby’s well-being in utero is often done using a cardiotocograph (CTG) machine, which records the baby’s heart beat pattern as well as the mother’s uterine contractions. However, reduced fetal movements may sometimes be fatal for the baby; thus, the biophysical profile (BPP) and the modified biophysical profile (MBPP) have been introduced. Methods: 242 patients at over 34 weeks of gestation and with one or more risk factors were included in the study. After demographic details were taken, the patients underwent detailed physical and clinical evaluation. Modified BPP was done on all the patients: the NST, an index of acute fetal hypoxia, was performed with the cardiotocograph (CTG), and the amniotic fluid volume was calculated. Results: On the fetal non-stress test, the majority of the patients (70.7%) were reactive while 29.3% were non-reactive. Most of the patients had an amniotic fluid index (AFI) in the normal range, i.e. between 8 and <25 cm; 18.6% had an AFI value of <6 cm while 13.6% had values between 6 and <8 cm. Among the babies with reactive NST, non-reactive NST and AFI ≤5, the most common outcome was low birth weight. An APGAR score <7 was observed in 11.1%, 13.1% and 20% of the reactive NST, non-reactive NST and AFI ≤5 groups respectively. Conclusions: The present study shows that BPP and MBPP are comparable to each other; therefore MBPP, being the easier test, can be substituted for BPP.

    SEPATH: benchmarking the search for pathogens in human tissue whole genome sequence data leads to template pipelines

    Background: Human tissue is increasingly being whole genome sequenced as we transition into an era of genomic medicine. With this arises the potential to detect sequences originating from microorganisms, including pathogens, amid the plethora of human sequencing reads. In cancer research, the tumorigenic ability of pathogens is being recognized, for example Helicobacter pylori and human papillomavirus in the cases of gastric non-cardia and cervical carcinomas respectively. As yet, no benchmark has been carried out on the performance of computational approaches for bacterial and viral detection within host-dominated sequence data. Results: We present the results of benchmarking over 70 distinct combinations of tools and parameters on 100 simulated cancer datasets spiked with realistic proportions of bacteria. mOTUs2 and Kraken are the highest performing individual tools, achieving median genus-level F1-scores of 0.90 and 0.91 respectively. mOTUs2 demonstrates high performance in estimating bacterial proportions. Employing Kraken on unassembled sequencing reads produces a good but variable performance depending on post-classification filtering parameters. These approaches are investigated on a selection of cervical and gastric cancer whole genome sequences, where Alphapapillomavirus and Helicobacter are detected in addition to a variety of other interesting genera. Conclusions: We provide the top performing pipelines from this benchmark in a unifying tool called SEPATH, which is amenable to high-throughput sequencing studies across a range of high-performance computing clusters. SEPATH provides a benchmarked and convenient approach to detect pathogens in tissue sequence data, helping to determine the relationship between metagenomics and disease.
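    The genus-level F1-scores quoted for mOTUs2 and Kraken summarize how well a tool's set of genus calls matches the genera actually spiked into a simulated sample, combining precision and recall. A minimal sketch of the metric (the genus names below are illustrative, not data from the benchmark):

```python
# Genus-level F1-score for a pathogen classifier: compare the set of
# genera a tool reports against the genera truly present in the sample.

def genus_f1(predicted, truth):
    predicted, truth = set(predicted), set(truth)
    tp = len(predicted & truth)   # genera correctly called
    fp = len(predicted - truth)   # spurious calls
    fn = len(truth - predicted)   # genera present but missed
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

truth = {"Helicobacter", "Alphapapillomavirus", "Fusobacterium"}
called = {"Helicobacter", "Alphapapillomavirus", "Escherichia"}
print(round(genus_f1(called, truth), 2))  # 0.67: 2 of 3 genera recovered, 1 spurious
```

    The benchmark's median F1-score per tool would then be the median of this value over the 100 simulated datasets; post-classification filtering (e.g. of low-confidence Kraken calls) trades false positives against false negatives in exactly these terms.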

    Modified Semi-Classical Methods for Nonlinear Quantum Oscillations Problems

    We develop a modified semi-classical approach to the approximate solution of Schrödinger's equation for certain nonlinear quantum oscillations problems. At lowest order, the Hamilton-Jacobi equation of the conventional semi-classical formalism is replaced by an inverted-potential-vanishing-energy variant thereof. Under smoothness, convexity and coercivity hypotheses on its potential energy function, we prove, using the calculus of variations together with the Banach space implicit function theorem, the existence of a global, smooth `fundamental solution'. Higher order quantum corrections, for ground and excited states, are computed through the integration of associated systems of linear transport equations, and formal expansions for the corresponding energy eigenvalues are obtained by imposing smoothness on the quantum corrections to the eigenfunctions. For linear oscillators our expansions naturally truncate, reproducing the well-known solutions for the energy eigenfunctions and eigenvalues. As an application, we calculate a number of terms in the corresponding expansions for the one-dimensional anharmonic oscillators of quartic, sectic, octic, and dectic types and find that our eigenvalue expansions agree with those of Rayleigh–Schrödinger perturbation theory, whereas our wave functions more accurately capture the more-rapid-than-Gaussian decay. For the quartic oscillator, our results strongly suggest that the ground state energy eigenvalue expansion and its associated wave function expansion are Borel summable, yielding natural candidates for the actual exact ground state solution and its energy. Our techniques for proving the existence of the crucial `fundamental solution' to the relevant Hamilton-Jacobi equation admit infinite dimensional generalizations; in a parallel project we shall show how this construction can be carried out for the Yang-Mills equations in Minkowski spacetime.
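    The "inverted-potential-vanishing-energy" replacement mentioned above has a standard schematic form: writing the ground state as an exponential ansatz and collecting the leading powers of ℏ yields a zero-energy Hamilton-Jacobi equation in the inverted potential. A sketch for a unit-mass oscillator with potential energy V (the symbols here are generic illustrations, not notation taken from the paper itself):

```latex
% Ground-state ansatz (schematic):
%   \psi_{\hbar}(x) \simeq N_{\hbar}\, e^{-S(x)/\hbar}
% Substituting into  -\tfrac{\hbar^{2}}{2}\,\Delta\psi + V(x)\,\psi = E\,\psi
% and keeping the lowest order in \hbar gives the vanishing-energy
% Hamilton-Jacobi equation for the inverted potential -V:
\[
  \tfrac{1}{2}\,\lvert \nabla S(x) \rvert^{2} \;-\; V(x) \;=\; 0 .
\]
```

    The `fundamental solution' whose existence the paper proves is a globally smooth S of this type; the higher-order corrections then come from expanding the remaining O(ℏ) terms as linear transport equations along the gradient flow of S.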