
    Towards the 3D Web with Open Simulator

    Continuing advances and reduced costs in computational power, graphics processors and network bandwidth have led to 3D immersive multi-user virtual worlds becoming increasingly accessible while offering an improved and engaging Quality of Experience. At the same time, the functionality of the World Wide Web continues to expand alongside the computing infrastructure it runs on, and pages can now routinely accommodate many forms of interactive multimedia components as standard features - streaming video, for example. Inevitably, there is an emerging expectation that the Web will expand further to incorporate immersive 3D environments. This is exciting because humans are well adapted to operating in 3D environments, and it is challenging because existing software and skill sets are focused on competencies in 2D Web applications. Open Simulator (OpenSim) is a freely available open-source toolkit that empowers users to create and deploy their own 3D environments in the same way that anyone can create and deploy a Web site. Its characteristics can be seen as a set of references for how the 3D Web could be instantiated. This paper describes experiments carried out with OpenSim to better understand network and system issues, and presents experience in using OpenSim to develop and deliver applications for education and cultural heritage. Evaluation is based upon observations of these applications in use and measurements of systems both in the lab and in the wild.

    Curated Databases in the Life Sciences: The Edinburgh Mouse Atlas Project

    This case study scopes and assesses the data curation aspects of the Edinburgh Mouse Atlas Project (EMAP), a programme funded by the Medical Research Council (MRC). The principal goal of EMAP is to develop an expression summary for each gene in the mouse embryo, which collectively has been named the Edinburgh Mouse Atlas Gene-Expression Database (EMAGE).

    Confronting unemployment in a street-level bureaucracy: jobcentre staff and client perspectives

    This thesis presents an account of the roles played by social actors in the implementation of unemployment policy in the UK. Lipsky’s (1980) theory of street-level bureaucracy has been adopted, updated to the contemporary context of the managerial state (Clarke & Newman, 1997) and developed in the specific case of the Jobcentre. The analysis is based on data collected during an ethnographic investigation of one case study Jobcentre office in Central Scotland. The methods consisted of six months of direct observation, interviews with 48 members of Jobcentre staff, semi-structured interviews with 35 users and analysis of notified vacancies and guidance documents. The argument is that front-line workers re-create policy as they implement it. They do so in reaction to a series of influences, constraints and incentives. Users therefore receive a service that is a modified version of the official policy. Users do not necessarily accept the policy that they are subjected to. They do not identify with the new managerialist notion of customer service because, as benefit recipients, they are denied purchasing power, choice and power. Unemployment policy is not delivered uniformly or unilaterally because front-line staff are active in developing work habits that influence the outcomes of policy. Policy is accomplished by staff in practice by categorising users into client types. This is significant because staff represent the state to the citizen in their interaction. Users are also active in accomplishing policy, whether they conform with, contest, negotiate or co-produce policy. Understanding what unemployment policy actually is, and what it means to people, depends on understanding these social processes by which policy emerges in practice.

    Understanding Views Around the Creation of a Consented, Donated Databank of Clinical Free Text to Develop and Train Natural Language Processing Models for Research: Focus Group Interviews With Stakeholders

    BACKGROUND: Information stored within electronic health records is often recorded as unstructured text. Special computerized natural language processing (NLP) tools are needed to process this text; however, complex governance arrangements make such data in the National Health Service hard to access, and therefore, it is difficult to use for research in improving NLP methods. The creation of a donated databank of clinical free text could provide an important opportunity for researchers to develop NLP methods and tools and may circumvent delays in accessing the data needed to train the models. However, to date, there has been little or no engagement with stakeholders on the acceptability and design considerations of establishing a free-text databank for this purpose. OBJECTIVE: This study aimed to ascertain stakeholder views around the creation of a consented, donated databank of clinical free text to help create, train, and evaluate NLP for clinical research and to inform the potential next steps for adopting a partner-led approach to establish a national, funded databank of free text for use by the research community. METHODS: Web-based in-depth focus group interviews were conducted with 4 stakeholder groups (patients and members of the public, clinicians, information governance leads and research ethics members, and NLP researchers). RESULTS: All stakeholder groups were strongly in favor of the databank and saw great value in creating an environment where NLP tools can be tested and trained to improve their accuracy. Participants highlighted a range of complex issues for consideration as the databank is developed, including communicating the intended purpose, the approach to access and safeguarding the data, who should have access, and how to fund the databank. 
Participants recommended that a small-scale, gradual approach be adopted to start to gather donations and encouraged further engagement with stakeholders to develop a road map and set of standards for the databank. CONCLUSIONS: These findings provide a clear mandate to begin developing the databank and a framework for stakeholder expectations, which we would aim to meet with the databank delivery.

    Immersive Installation: “A Virtual St Kilda”

    This paper discusses a Virtual Histories project, which developed a digital reconstruction of St Kilda. St Kilda is the most remote and westerly part of the United Kingdom. It was evacuated in the 1930s and lay empty for several decades. It is a World Heritage Site for both its built and natural environment. The Virtual St Kilda acted as a focus for the collection and presentation of tangible and intangible cultural heritage. It was on show as an exhibition in the Taigh Chearsabah museum (Figure 5), located in North Uist, Scotland. The exhibition is built around the OpenSimulator Open Virtual Worlds server running on commodity hardware. The simulation covers some 4 square km of virtual space and models both tangible and intangible culture. It is integrated into an exhibition and articulates an interpretation of the St Kilda legacy through the prism of contemporary North Uist life.

    Improved quantification of HIV-1-infected CD4+ T cells using an optimised method of intracellular HIV-1 gag p24 antigen detection

    The capacity of CD8+ T cells to inhibit HIV-1 replication in vitro strongly correlates with virus control in vivo. Post-hoc evaluations of HIV-1 vaccine candidates suggest that this immunological parameter is a promising benchmark of vaccine efficacy. Large-scale analysis of CD8+ T cell antiviral activity requires a rapid, robust and economical assay for accurate quantification of HIV-1 infection in primary CD4+ T cells. Detection of intracellular HIV-1 p24 antigen (p24 Ag) by flow cytometry is one such method, but it is thought to be less sensitive and quantitative than p24 Ag ELISA. We report that fixation and permeabilisation of HIV-infected cells using paraformaldehyde/50% methanol/Nonidet P-40 instead of a conventional paraformaldehyde/saponin-based protocol improved their detection across multiplicities of infection (MOI) ranging from 10⁻² to 8×10⁻⁵, and by nearly two-fold (p<0.001) at the optimal MOI tested (10⁻²). The frequency of infected cells was strongly correlated with p24 Ag release during culture, thus validating its use as a measure of productive infection. We were also able to quantify infection with a panel of HIV-1 isolates representing the major clades. The protocol described here is rapid and cost-effective compared with ELISA and thus could be a useful component of immune monitoring of HIV-1 vaccines and interventions to reduce viral reservoirs. © 2013 Elsevier B.V.

    In Situ Recrystallization of Co-Evaporated Cu(In,Ga)Se₂ Thin Films by Copper Chloride Vapor Treatment Towards Solar Cell Applications

    Cu(In,Ga)Se₂ (or CIGS) thin films and devices were fabricated using a modified three-stage process. Because the process used high deposition rates and a low temperature, a copper chloride vapor treatment was introduced between the second and third stages to enhance the films' properties. X-ray diffraction and scanning electron microscopy demonstrate that drastic changes occur after this recrystallization process, yielding films with much larger grains. Secondary ion mass spectrometry shows that the depth profiles of some elements (such as Cu, In and Se) are not modified, while others (such as Ga and Na) change dramatically. Because of the competing effects of these changes, not all solar cell parameters are enhanced, yielding at most a 15% increase in device efficiency.