
    Copyright and E-learning: A Guide for Practitioners

    No abstract available

    Requirement for the coexpression of T3 and the T cell antigen receptor on a malignant human T cell line.

    The association between T3 and the T cell antigen receptor was examined using the T3-bearing T cell leukemic line Jurkat. A monoclonal antibody, C305, was produced, which reacted with idiotypic-like determinants expressed on Jurkat. The molecule with which this antibody reacted was a disulfide-linked heterodimer of 90 kD, composed of polypeptides of 42 and 54 kD. Thus, C305 reacted with a molecule with characteristics of the putative T cell antigen receptor described by others. A series of mutants of Jurkat, induced with ethyl methane sulfonate or radiation, was selected for T3 or antigen receptor negativity. In every instance, there was a concomitant loss of both T3 and the antigen receptor as assessed by quantitative absorption, indirect immunofluorescence, and antibody plus complement-mediated cytotoxicity. The absence of antigen receptor molecules was confirmed on diagonal gels, excluding the possibility that conformational changes of the antigen receptor on such T3-negative mutants were responsible for the failure of such mutants to react with C305. Moreover, in a mutant that expressed a marked decrease in the level of T3 expression, there was a comparable decrease in the expression of antigen receptor determinants. These results suggest that there is an obligate requirement for the coexpression of T3 and the T cell antigen receptor. Furthermore, attempts to activate such mutants with the lectin phytohemagglutinin suggested that the expression of T3 and/or the antigen receptor was required for activation of these cells.

    Reply to M. Horiguchi et al

    No abstract available

    Sulphur Equilibrium Between Slags and Iron-Carbon Alloys

    Abstract Not Provided

    Optimisation of Fluid Properties and Process Parameters for Batch Slot Die Coating of Dilute Polymer Systems

    The effects of viscosity and surface tension are well documented for roll-to-roll slot die systems. However, their influence in combination with process parameters for batch-scale coaters is less often discussed. A lack of coating uniformity attained using a bespoke slot die coater at the CPI National Printable Electronics Centre prompted an investigation to identify the sources of error. This project outlines a method used to determine the effects of material and process parameters on coating quality without requiring modification of the equipment. It also describes the development of a series of versatile formulations containing polyethylene oxide that can be used to mimic similar polymer formulations for cost-effective initial process development of new materials. Various dilute polymer systems were developed in which solids loading, viscosity and surface tension were all fully controlled. Design of Experiments techniques were employed to determine relationships between process parameters and fluid properties. The chosen parameters included coating velocity, coating acceleration, coating gap (the gap between the slot die lips and the substrate), volume infuse (the volume of material initially infused) and flow infuse (the flow rate of material initially infused). Their effects on coating thickness and thickness uniformity were quantified for the bulk region of the coating, resulting in a statistical model to aid optimisation of parameters for new materials. Surface tension had a minimal effect on coating thickness and uniformity during the screening study, so it was not included in the final model. Coating acceleration and flow infuse showed a proportional relationship with coating thickness uniformity. Volume infuse did not have a significant effect on coating thickness or uniformity. Particular focus was placed on viscosity and coating velocity; for this bespoke tool, both were found to be inversely proportional to coating thickness and uniformity. They strongly influenced the flow rate and the resultant coating thickness through fluid-resistance effects, even though the wet coat thickness parameter was held at a constant value. This suggests that the system does not use a feedback loop to correct the flow rate for fluid resistance and internal pressure effects. Nevertheless, an operating window was found in which coating thickness and uniformity can be controlled for polymer solutions across a range of viscosities.
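    The inverse relationship between coating velocity and thickness reported above is consistent with the standard pre-metered relation for slot die coating, in which wet film thickness equals the volumetric flow rate divided by coating width and velocity. The sketch below illustrates that textbook relation only; the function name, units and numerical values are illustrative assumptions and are not parameters taken from this study.

```python
# Minimal sketch of the pre-metered slot die relation: t_wet = Q / (w * v).
# All numbers below are illustrative assumptions, not values from the study.

def wet_film_thickness(flow_rate_ul_s: float, width_mm: float, velocity_mm_s: float) -> float:
    """Wet film thickness (um) for a pre-metered slot die coater.

    flow_rate_ul_s : pump flow rate in microlitres per second (1 uL = 1 mm^3)
    width_mm       : coating width in millimetres
    velocity_mm_s  : substrate (coating) velocity in millimetres per second
    """
    thickness_mm = flow_rate_ul_s / (width_mm * velocity_mm_s)  # mm^3/s / (mm * mm/s) = mm
    return thickness_mm * 1000.0  # convert mm to micrometres


if __name__ == "__main__":
    # Doubling the coating velocity halves the wet thickness at a fixed flow rate,
    # matching the inverse relationship reported in the abstract.
    for v in (10.0, 20.0, 40.0):
        print(f"velocity {v:5.1f} mm/s -> wet thickness {wet_film_thickness(50.0, 50.0, v):6.2f} um")
```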

    The Melancholy Truth: Corrective and Equitable Justice for Omar Khadr

    Omar Khadr stands for the melancholy proposition that Canadian courts will recognize a rights violation without demanding an effective remedy. Over the years, Khadr secured many legal remedies, but not the one he sought most: a repatriation order. Why? This paper ventures explanations by viewing the final five Khadr judgments through the lenses of corrective and equitable justice. The final section of the paper recasts the case for the repatriation of Omar Khadr based on two principal arguments. First, a context of structural injustice suggests the application of equitable remedial principles rather than corrective justice, even in the transnational context in which Canada cannot impose structural remedies. Second, the Khadr case suggests that declaratory relief is not an appropriate remedy when delay may cause irreparable harm and where the government may be credibly suspected of bad faith.

    The diagnostic performance of routinely acquired and reported computed tomography imaging in patients presenting with suspected pleural malignancy

    Objectives: Contrast-enhanced computed tomography (CT) provides essential cross-sectional imaging data in patients with suspected pleural malignancy (PM). The performance of CT in routine practice may be lower than in previously reported research. We assessed this relative to ‘real-life’ factors, including the use of early arterial-phase contrast enhancement (by CT pulmonary angiography (CTPA)) and non-specialist radiology reporting. Materials and methods: Routinely acquired and reported CT scans in patients recruited to the DIAPHRAGM study (a prospective, multi-centre observational study of mesothelioma biomarkers) between January 2014 and April 2016 were retrospectively reviewed. CT reports were classified as malignant if they included specific terms (e.g. “suspicious of malignancy”, “stage M1a”) and as benign if other terms were used (e.g. “indeterminate”, “no cause identified”). All patients followed a standard diagnostic algorithm. The diagnostic performance of CT (overall and by the above factors) was assessed using 2 × 2 contingency tables. Results: 30/345 (9%) eligible patients were excluded (non-contrast CT (n = 13), non-contiguous CT (n = 4), incomplete follow-up (n = 13)). 195/315 (62%) patients studied had PM; 90% were cyto-histologically confirmed. 172/315 (55%) presented as an acute admission, of whom 31/172 (18%) had CTPA. Overall, CT sensitivity was 58% (95% CI 51–65%) and specificity was 80% (95% CI 72–87%). The sensitivity of CTPA (performed in 31/315 (10%)) was lower (27% (95% CI 9–53%)) than that of venous-phase CT (61% (95% CI 53–68%); p = 0.0056). The sensitivity of specialist thoracic radiologist reporting was higher (68% (95% CI 55–79%)) than that of non-specialist reporting (53% (95% CI 44–62%); p = 0.0488). Specificity was not significantly different. Conclusion: The diagnostic performance of CT in routine clinical practice is insufficient to exclude or confirm PM. A benign CT report should not dissuade pleural sampling where the presence of primary or secondary pleural malignancy would alter management. Sensitivity is lower with non-thoracic radiology reporting and particularly low with CTPA.
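    For readers unfamiliar with the 2 × 2 contingency analysis behind these figures, the sketch below computes sensitivity, specificity and approximate 95% confidence intervals from a 2 × 2 table. The cell counts are reconstructed approximately from the percentages reported in the abstract, and the Wilson score method is an assumption, since the paper's exact interval method and full table are not reproduced here.

```python
# Minimal sketch: sensitivity, specificity and approximate 95% CIs from a 2x2 table.
# Cell counts are approximate reconstructions from the abstract, not the study's raw data.
from math import sqrt


def wilson_ci(successes: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """Wilson score 95% confidence interval for a proportion."""
    p = successes / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return centre - half, centre + half


def report(tp: int, fn: int, tn: int, fp: int) -> None:
    """Print sensitivity and specificity with Wilson 95% CIs for a 2x2 table."""
    sens, sens_ci = tp / (tp + fn), wilson_ci(tp, tp + fn)
    spec, spec_ci = tn / (tn + fp), wilson_ci(tn, tn + fp)
    print(f"sensitivity {sens:.0%} (95% CI {sens_ci[0]:.0%}-{sens_ci[1]:.0%})")
    print(f"specificity {spec:.0%} (95% CI {spec_ci[0]:.0%}-{spec_ci[1]:.0%})")


if __name__ == "__main__":
    # Hypothetical counts consistent with 195 malignant and 120 benign cases
    # at roughly 58% sensitivity and 80% specificity.
    report(tp=113, fn=82, tn=96, fp=24)
```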

    Fear and loathing in Glasvegas: Contemporary approaches to the copyright clearance headache

    If we are committed to meeting user expectations by providing the fullest access possible to our collections, how do we overcome our fears as a profession to confidently navigate a copyright landscape of changing legislation, frameworks, regulatory structures and moral imperatives? Our presentation will attempt to answer this question by providing both an example of a single institutional approach and a broader view of the sector's practices in relation to rights clearance. The Glasgow School of Art has recently completed a project to make its collections accessible online, and in doing so undertook a diligent search exercise before making material available. While copyright legislation is an issue all archivists must contend with, making visual arts archives and collections available online has raised unique moral and legal considerations. This example of institutional best practice, which specifically deals with artistic works, will be contextualised by a broader look at other collection types and at approaches to managing the challenges associated with copyright clearance while making archive collections available online across the UK sector. Emerging principles of good practice in rights management will be illustrated with examples taken from recent case studies with a variety of archive services. Some of the issues this presentation will discuss include: How do we, as archive professionals, react to the current copyright framework? What moral implications become apparent during a rights clearance process? How do we overcome our fear of the law, as professionals, and take measured risks in a confident, pragmatic and informed way? How do we fulfil our role as recordkeepers while balancing the access expectations of our users against the rights of copyright holders?

    Digitisation Workflow and Guidelines: Digitisation Processes

    This document outlines the workflow and best practice required to implement the digitisation of physical objects as part of the CHARTER project. It details the CHARTER scanning requirements and parameters for the creation of preservation master files and compressed images for viewing.
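    As an illustration of the distinction the document draws between preservation master files and compressed viewing copies, the sketch below derives a JPEG access copy from a TIFF master using Pillow. The file names, target size and quality setting are assumptions for illustration only and are not the CHARTER project's specified scanning parameters.

```python
# Minimal sketch: derive a compressed access copy from a preservation master.
# File names, target size and JPEG quality are illustrative assumptions only;
# they are not the CHARTER project's specified scanning parameters.
from PIL import Image


def make_access_copy(master_path: str, access_path: str,
                     max_long_edge: int = 2000, quality: int = 85) -> None:
    """Create a downsized JPEG viewing copy, leaving the TIFF master untouched."""
    with Image.open(master_path) as img:
        img = img.convert("RGB")                       # JPEG cannot store alpha or 16-bit channels
        img.thumbnail((max_long_edge, max_long_edge))  # scale down in place, preserving aspect ratio
        img.save(access_path, "JPEG", quality=quality)


if __name__ == "__main__":
    make_access_copy("master_0001.tif", "access_0001.jpg")
```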

    Archives, digitisation and copyright: do archivists in the UK avoid risk through strict compliance with copyright law when they digitise their collections?

    The duration and complexity of copyright in relation to unpublished materials is contributing to a 20th century ‘black hole’ in the online historical record. Archives collect, preserve, and provide access to records of governments, businesses, communities and individuals: the raw evidence of transactions, activities and events that informs our understanding of the past. The transformative nature of online access to the archival record supports human rights, democracy, openness, transparency, accountability, culture, learning, research and innovation. Despite reform, the legal framework in the UK fails to provide a safe harbour for archives that could make comprehensive online access to the country’s rich and diverse archival holdings possible. This thesis presents the results of a survey of the UK archive sector that explores how copyright affects digitisation of collections, and analyses five digitisation projects at a variety of archive institutions, in order to better understand the decision-making processes and risk management strategies that make archive collections containing third-party rights materials available online, despite the tendency towards risk aversion within the archive sector. The thesis found that a small proportion of UK archives have made third-party rights-holder material available online, supporting the view that the sector, in general, is risk averse in relation to third-party copyrights. However, the evidence gathered suggests that approaches taken by less risk-averse institutions can be adapted to suit the needs of a wide range of cultural heritage institutions, and that best-practice guidance could have a significant impact on online access to 20th century collections. The study contributes baseline data on the sectoral approach to copyright, rights clearance and risk management, and on how these approaches affect digitisation, in order to provide a starting point for further research and best-practice guidance for the UK archive sector.