
    Identities for the gamma and hypergeometric functions: an overview from Euler to the present

    A research report submitted to the Faculty of Science, University of the Witwatersrand, in fulfilment of the requirements for the degree of Master of Science, Johannesburg, 2013. Equations involving the gamma and hypergeometric functions are of great interest to mathematicians and scientists, and newly proven identities for these functions assist in finding solutions to differential and integral equations. In this work we trace a brief history of the development of the gamma and hypergeometric functions, illustrate the close relationship between them, and present a range of their most useful properties and identities, from the earliest ones to those developed in more recent years. Our literature review will show that while continued research into hypergeometric identities has generated many new results, some of these can be shown to be variations of known identities. Hence, we will also discuss computer-based methods that have been developed for creating and analysing such identities, in order to check them for originality and numerical validity.
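
    Two classical identities of the kind surveyed in this report, stated here purely for orientation (they are standard results, not claims drawn from the report itself), are Euler's reflection formula and Gauss's summation theorem for the hypergeometric function:

        \Gamma(z)\,\Gamma(1 - z) = \frac{\pi}{\sin \pi z}, \qquad z \notin \mathbb{Z}

        {}_2F_1(a, b; c; 1) = \frac{\Gamma(c)\,\Gamma(c - a - b)}{\Gamma(c - a)\,\Gamma(c - b)}, \qquad \operatorname{Re}(c - a - b) > 0

    The reflection formula ties the gamma function to trigonometry, while Gauss's theorem evaluates the hypergeometric series at unit argument entirely in terms of gamma values, exactly the kind of interplay between the two functions that the report traces.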

    Contributions to imaging.

    Four topics are considered, each associated with a different aspect of imaging. Using X-ray diffraction it is possible to say much about the structure of molecules. Two models for the DNA molecule are analysed with respect to the diffraction data: Watson and Crick's "double helix" and Rodley's recently developed "side-by-side". It is demonstrated that the side-by-side is a viable alternative model for DNA; however, the low quality of the data precludes a definitive decision as to the actual structure of the molecule. Conventional X-ray computed tomography body scanners, while producing impressive results when imaging stationary objects, cannot image rapidly moving organs such as the beating heart. As the heart motion is periodic, it has been suggested that stroboscopic techniques be employed; however, the resulting image quality is poor when standard image reconstruction methods are used. By taking account of the fact that the region surrounding the heart is stationary, a significant improvement in image quality can be obtained, and a simple procedure for achieving this improvement is presented here. Ultrasonic transmission tomography is more complicated than the X-ray case because ultrasonic rays, unlike X-rays, are diffracted as they pass through a body and are therefore generally curved. It is shown how ray curvature makes it impossible to image certain types of object exactly. Nevertheless, it seems that useful results can be obtained by treating the rays as straight and using X-ray computed tomography image reconstruction algorithms. Finally, imaging using electric currents is examined, and the types of independent measurement that can be made are discussed. The imaging problem is far from trivial and, in the general case, largely unsolved. Here a method for uniquely imaging circularly symmetric conductivity distributions is outlined.
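
    The stroboscopic (gated) idea described above can be sketched in a few lines: only projections recorded near a common cardiac phase are passed on to a standard reconstruction algorithm. The Python below is a hand-rolled illustration of gating, not the thesis's actual procedure; all names and data in it are invented.

        import numpy as np

        def gate_projections(projections, angles, phases, target_phase, tol=0.05):
            """Keep only projections acquired near a chosen cardiac phase.

            projections  : (n, d) array, one row per measured projection
            angles       : (n,) acquisition angles in radians
            phases       : (n,) cardiac phase in [0, 1) recorded per projection
            target_phase : phase at which to reconstruct (e.g. end-diastole)
            tol          : half-width of the accepted phase window
            """
            # Distance on the phase circle, allowing wrap-around at 1.0.
            d = np.abs(phases - target_phase)
            dist = np.minimum(d, 1.0 - d)
            keep = dist < tol
            return projections[keep], angles[keep]

        # Example: simulate 360 projections over several heartbeats, then gate.
        rng = np.random.default_rng(0)
        n = 360
        projections = rng.normal(size=(n, 128))   # stand-in projection data
        angles = np.linspace(0, np.pi, n)
        phases = np.linspace(0, 7.3, n) % 1.0     # ~7 cardiac cycles per scan
        gated, gated_angles = gate_projections(projections, angles, phases, 0.0)
        print(f"kept {len(gated_angles)} of {n} projections")

    Exploiting the stationary region surrounding the heart, as the thesis proposes, would reduce how many gated projections are needed for an acceptable image; that refinement is not shown in this sketch.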

    Constructing curvature: the iterative design of a computer-based microworld for non-Euclidean geometry.

    The study charts the iterative development of a computer-based microworld for non-Euclidean geometry. Its aim was to explore the possibilities for constructing a suitable context that simultaneously articulated the processes of teaching and learning using computer-based versions of Euclidean models for non-Euclidean geometry, and the construction of the context. Using the microworld paradigm as the basis for a model of a computer-based learning environment, the study defines a microworld not only in terms of the computational and non-computational tools available to the learner, but also with reference to its pedagogical intentions and cognitive presuppositions. The model of the microworld that was created was then used to guide its design and development. The computational element of the microworld employed an object-oriented version of the Lisp-based programming language Logo to implement Turtle Graphics in a non-Euclidean context. The design process for the microworld was iterative. Activities, which brought together software and specific pedagogic approaches to non-Euclidean geometry, were trialled and modified in the light of learners' experiences with the microworld. Organised into three developmental cycles, the study describes and analyses each iteration under three interrelated categories: technical refinement of the software and non-computational objects, structuring of the pedagogical framework, and the cognitive development of the learners mediated by their experience of the microworld. The study concludes with an appreciation of this iterative development process. It proposes a framework for microworld creation based on the principles of design and of learning as the exploration of a knowledge domain.
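
    To give a concrete taste of the non-Euclidean behaviour such a microworld makes visible, the sketch below (a standalone Python illustration, not the study's Logo-based microworld) measures the angle sum of a geodesic triangle on the unit sphere, where it exceeds 180 degrees by exactly the triangle's area:

        import numpy as np

        def angle_at(a, b, c):
            """Angle of the geodesic triangle abc at vertex a, on the unit sphere."""
            # Tangent directions at a toward b and c: project the chords
            # onto the tangent plane at a.
            tb = b - np.dot(a, b) * a
            tc = c - np.dot(a, c) * a
            cosang = np.dot(tb, tc) / (np.linalg.norm(tb) * np.linalg.norm(tc))
            return np.arccos(np.clip(cosang, -1.0, 1.0))

        # A triangle with vertices on the three coordinate axes spans one
        # octant of the sphere; each of its angles is a right angle.
        a, b, c = np.eye(3)
        angles = [angle_at(a, b, c), angle_at(b, c, a), angle_at(c, a, b)]
        print(f"angle sum = {np.degrees(sum(angles)):.1f} degrees")   # 270.0
        print(f"spherical excess = {sum(angles) - np.pi:.4f}")        # pi/2

    The excess over 180 degrees equals the triangle's area (pi/2 for the octant), which is the kind of curvature phenomenon a turtle-geometry learner can discover by walking triangles of different sizes.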

    Corporate Social Capital and Firm Performance in the Global Information Technology Services Sector

    The confluence of a number of marketplace phenomena has provided the impetus for the selection and conduct of this research. The first is the so-called value relevance of intangibles in determining the share market performance of publicly listed companies; the growing gap between market and book values has been proposed as an indication of the impact of intangibles on share price values. A second, related phenomenon is the increasing reliance on share price appreciation as the principal means of shareholder return, as opposed to returns through dividends. This suggests that share prices are becoming an even more critical firm performance measure than traditional accounting-based measures like return on investment (ROI). A third phenomenon is the growth in marketplace alliances and joint ventures, the number of which has increased rapidly over the past 30 years. The explanation for these phenomena may lie in the concept of corporate social capital (CSC) which, as an intangible asset (IA), has been proposed in several normative studies. CSC has been defined as “the set of resources, tangible or virtual, that accrue to a corporate player through the player’s social relationships, facilitating the attainment of goals” (Leenders & Gabbay, 1999, p. 3). However, constructs for CSC have been only loosely defined, and its impacts on firm performance have received little empirical testing. This research addresses this gap in the literature. The key aim of this research is to explore the impact of CSC on firm performance. Through the use of CSC as a lens for viewing a firm’s intangibles, several important sub-components of the CSC formulation are exposed. These include a firm’s market centrality (CENT), absorptive capacity (AC), internal capital (INC), human capital (HC) and financial soundness. Therefore, an extended aim of this research is to identify the differential impacts of the CSC sub-components on firm performance. Firm performance was measured as ROI, market-to-book ratio (Tobin’s Q) and total shareholder return (TSR). Overall, the research results indicate that CSC is a significant predictor of firm performance, but falls short of fully explaining the market-to-book value disparity. For this research, an innovative computer-supported content analysis (CA) technique was devised to capture the majority of the data required for the empirical work. The use of a commercial news aggregation service, Factiva, and a standard taxonomy of terms for the search allowed variables for intangible constructs to be derived from a relatively large sample of firms (n=155) from the global information technology services (ITS) sector from 2001 to 2004. Data indices for joint venture or alliance activity, research and development (R&D) activity, HC, INC and external capital (EC) were all developed using this CA approach. The findings indicated that the benefits of CSC do not accrue equally to all firms in the sector. For larger, more mature firms, financial soundness does not necessarily correlate with improved shareholder return; the inference is that these firms may have reached a plateau in terms of how the market is valuing them. In terms of market centrality, the research indicates that software firms could benefit from building a larger number of alliances and becoming more centrally connected in the marketplace. The reverse is true, however, for larger, more established firms in the non-software sectors. These companies can be penalised for being over-connected, potentially signalling that they are locked into a suite of alliances that will ultimately limit their capacity to innovate and grow. For smaller, potentially loss-making firms, the research indicates that investment in HC is potentially the only investment strategy that could improve profitability and shareholder return; investments by such firms in R&D or INC development are likely to depress shareholder value and should therefore be minimised in favour of HC investment. For larger, more established firms, investment in HC is beneficial for both ROI and TSR. Investments in areas like R&D and INC were found to be beneficial only to those firms with the financial capacity to afford them; firms that did not appear to have the financial resources to support their level of investment in R&D and/or INC were penalised by the market. Overall, the research provides specific insights into the links between appropriate investments in CSC and firm performance. In terms of research practice, this research demonstrates the viability of computer-supported CA. Progress in the development of more intelligent search technologies will provide increasing utility to CA researchers, promising to unlock a vast range of textual source data that was previously beyond the reach of manual CA practices.
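
    As a rough illustration of the modelling style described above (variable names and data here are invented, and the thesis's actual model specification is not reproduced), the sketch below computes a simple market-to-book proxy for Tobin's Q and fits an ordinary least squares regression of shareholder return on stand-ins for the CSC sub-components:

        import numpy as np

        def tobins_q(market_cap, total_debt, book_assets):
            """Simple market-to-book proxy for Tobin's Q."""
            return (market_cap + total_debt) / book_assets

        print(tobins_q(market_cap=1_200.0, total_debt=300.0,
                       book_assets=1_000.0))                 # 1.5

        # Illustrative OLS: regress synthetic TSR on four invented indices
        # standing in for CENT, AC, INC and HC.
        rng = np.random.default_rng(1)
        n = 155                                  # sample size used in the study
        X = rng.normal(size=(n, 4))
        beta_true = np.array([0.3, 0.1, -0.2, 0.5])
        tsr = X @ beta_true + rng.normal(scale=0.5, size=n)

        X1 = np.column_stack([np.ones(n), X])    # add intercept column
        beta_hat, *_ = np.linalg.lstsq(X1, tsr, rcond=None)
        print("estimated betas (excl. intercept):", np.round(beta_hat[1:], 2))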

    Developing and evaluating packages to support implementation of quality indicators in general practice : the ASPIRE research programme, including two cluster RCTs

    This is the final version, available from the NIHR Journals Library via the DOI in this record. Data-sharing statement: all data requests should be submitted to the corresponding author for consideration; access to anonymised data may be granted following review.

    Background: Dissemination of clinical guidelines is necessary but seldom sufficient by itself to ensure the reliable uptake of evidence-based practice. There are further challenges in implementing multiple clinical guidelines and clinical practice recommendations in the pressurised environment of general practice.

    Objectives: We aimed to develop and evaluate an implementation package that could be adapted to support the uptake of a range of clinical guideline recommendations and be sustainably integrated within general practice systems and resources. Over five linked work packages, we developed ‘high-impact’ quality indicators to show where a measurable change in clinical practice can improve patient outcomes (work package 1), analysed adherence to selected indicators (work package 2), developed an adaptable implementation package (work package 3), evaluated the effects and cost-effectiveness of adapted implementation packages targeting four indicators (work package 4) and examined intervention fidelity and mechanisms of action (work package 5).

    Setting and participants: Health-care professionals and patients from general practices in West Yorkshire, UK.

    Design: We reviewed recommendations from existing National Institute for Health and Care Excellence clinical guidance and used a multistage consensus process, including 11 professionals and patients, to derive a set of ‘high-impact’ evidence-based indicators that could be measured using routinely collected data (work package 1). In 89 general practices that shared data, we found marked variations and scope for improvement in adherence to several indicators (work package 2). Interviews with 60 general practitioners, practice nurses and practice managers explored perceived determinants of adherence to selected indicators and suggested the feasibility of adapting an implementation package to target different indicators (work package 3). We worked with professional and patient panels to develop four adapted implementation packages. These targeted risky prescribing involving non-steroidal anti-inflammatory and antiplatelet drugs, type 2 diabetes control, blood pressure control and anticoagulation for atrial fibrillation. The implementation packages embedded behaviour change techniques within audit and feedback, educational outreach and (for risky prescribing) computerised prompts. We randomised 178 practices to implementation packages targeting either diabetes control or risky prescribing (trial 1), or blood pressure control or anticoagulation (trial 2), or to a further control (non-intervention) group, and undertook economic modelling (work package 4). In trials 1 and 2, practices randomised to the implementation package for one indicator acted as control practices for the other package, and vice versa. A parallel process evaluation included a further eight practices (work package 5).

    Main outcome measures: Trial primary end points at 11 months comprised achievement of all recommended levels of glycated haemoglobin, blood pressure and cholesterol; risky prescribing levels; achievement of recommended blood pressure; and anticoagulation prescribing.

    Results: We recruited 178 (73%) of 243 eligible general practices. We randomised 80 practices to trial 1 (40 per arm) and 64 to trial 2 (32 per arm), with 34 non-intervention controls. The risky prescribing implementation package reduced risky prescribing (odds ratio 0.82, 97.5% confidence interval 0.67 to 0.99; p = 0.017) with an incremental cost-effectiveness ratio of £2337 per quality-adjusted life-year. The other three packages had no effect on primary end points. The process evaluation suggested that trial outcomes were influenced by losses in fidelity throughout intervention delivery and enactment, and by the nature of the targeted clinical and patient behaviours.

    Limitations: Our programme was conducted in one geographical area; however, practice and patient population characteristics are otherwise likely to be sufficiently diverse and typical to enhance generalisability to the UK. We used an ‘opt-out’ approach to recruit general practices to the randomised trials; consequently, our trial practices may have engaged with the implementation package less than if they had actively volunteered. However, this approach increases confidence in the wider applicability of the trial findings, as it replicates guideline implementation activities under standard conditions.

    Conclusions: This pragmatic, rigorous evaluation indicates the value of an implementation package targeting risky prescribing. In broad terms, an adapted ‘one-size-fits-all’ approach did not consistently work, with no improvement for the other targeted indicators.

    Future work: There are challenges in designing ‘one-size-fits-all’ implementation strategies that are sufficiently robust to bring about change in the face of difficult clinical contexts and fidelity losses. We recommend maximising feasibility and ‘stress testing’ prior to rolling out interventions within a definitive evaluation. Our programme has led on to other work, adapting audit and feedback to other priorities and evaluating different ways of delivering feedback to improve patient care.

    Funding: National Institute for Health Research (NIHR).
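
    For orientation, an incremental cost-effectiveness ratio (ICER) of the kind reported above is the cost difference between intervention and control divided by the difference in quality-adjusted life-years. The figures in the sketch below are hypothetical, chosen only so the ratio lands near the reported £2337 per QALY; they are not the trial's actual inputs.

        def icer(cost_intervention, cost_control, qaly_intervention, qaly_control):
            """Incremental cost-effectiveness ratio: extra cost per extra QALY."""
            return ((cost_intervention - cost_control)
                    / (qaly_intervention - qaly_control))

        # Invented illustrative totals: £11,685 extra cost for 5 extra QALYs.
        print(f"ICER = £{icer(21_685.0, 10_000.0, 12.0, 7.0):,.0f} per QALY")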

    Governing through the climate: climate change, the Anthropocene, and global governmentality

    The concept of anthropogenic climate change is now understood in the discipline of International Relations (IR) as an urgent environmental problem enveloping the globe. It underlies recent claims that humanity’s impact on the Earth’s natural systems is so consequential that a new geologic epoch has begun: the Anthropocene, or the ‘human age’. Yet IR’s increasing engagement with and use of these scientific concepts raise significant questions the discipline has yet to address. For instance, if global climate change appeared in international politics only as recently as the late 1980s, what spurred this sudden emergence? If the Anthropocene appeared only after 2000, how does this new concept affect the way we now think about global politics, the Earth, and even ourselves? This thesis answers these questions by arguing that the concepts of global climate change and the Anthropocene are neither immutable nor universal scientific truths or natural objects. Rather, they emerged when technological advances in nuclear physics and models tracing bomb radiocarbon intersected with the ways states govern their territories and subjects. The global nature or ‘climatic globality’ of these concepts, therefore, is a manner of conducting and steering human conduct and action by establishing the boundaries of subjectivity when they are thought. This is what Michel Foucault called governmentality. It is demonstrated in this thesis through a genealogical tracing of climate change in IR, focusing on how nuclear sciences, computational modelling technologies and regimes of international governance overlapped to form the climatic globality IR now takes for granted. Combining genealogy with the philosophies of Martin Heidegger and Hannah Arendt, a new form of global governmentality becomes evident. Through a technological and metaphysical subjectivism with the carbon atom as its substrate, the human self now asserts itself from atomic to global scales as the maker, master and steward of the Earth.

    The Role of Organisational Culture on Cognitive Learning Styles in Libyan Universities

    The main aim of the study is to explore the potential role of organisational culture on learning styles in Libyan universities. In so doing, the research embarked on a search for suitable literature relating to both learning styles and organisational culture. The study learnt that cognitive learning styles should be treated as processes of mental activity, learning and problem solving that are independent of subject content; that they span perceptual, intellectual, personality and social domains; and that they tend to remain unchanged over long periods of time. Nevertheless, as recently reported in the area of neuropsychology, the assumption of a fixed personality has been relaxed, so that an individual’s personality may change over time and under different environments or situations. This led the research to focus, inter alia, more profoundly on two main constructs: the personal learning environment (PLE) and personal learning styles pedagogy (PLSP). The appropriate methodology was found to be a mixed approach based on a survey, consisting of a structured questionnaire and semi-structured interviews. In order to satisfy the statistical properties, the sample size for each university was set at 300 students, for which the response rates varied between 66% and 70%. On the whole, the final sample for each university turned out to be sufficient for consistency and reliability of the inferred statistics. Interviews of teaching staff in each university were conducted in support of the findings from the student questionnaire. The results appeared to be conclusive in terms of satisfying the initial aims and questions of the study. Following a series of statistical tests and analyses, primarily using structural equation models, the findings suggest that the verbaliser-imager tends to be the more common style of learning amongst students in these universities. The findings from the teaching staff interviews revealed the universities’ lack of support, through the provision of resources and funds, for new and innovative teaching developments. It appears that the universities, on the whole, have failed miserably to promote innovative teaching and have denied their students quality teaching and learning styles. In short, the findings from the interviews suggest that the entire higher education system in Libya has under-performed for many years in the two most important aspects of education quality: innovative teaching and the promotion of cognitive learning styles.

    How do different aspects of spatial skills relate to early arithmetic and number line estimation?
