
    Web Based Semantic Communities – Who, How and Why We Might Want Them in the First Place

    This paper describes an investigation, undertaken as part of the FicNet Human-Computer Interaction project, into the online amateur fiction community. By working with the community to determine current practices and areas of concern, we consider how future technologies such as the Semantic Web might be used to design applications to support the community. As a first step in this process we gathered opinions both from members of the community and from those outside the community who had come into contact with it. Taking this information, we consider the community as it is and what it might become.

    Bringing Communities to the Semantic Web and the Semantic Web to Communities

    In this paper we consider the types of community networks that are most often codified within the Semantic Web. We propose the recognition of a new structure which fulfils the definition of community used outside the Semantic Web. We argue that the properties inherent in a community allow additional processing to be done with the described relationships existing between entities within the community network. Taking an existing online community as a case study, we describe the ontologies and applications that we developed to support this community in the Semantic Web environment, and discuss what lessons can be learnt from this exercise and applied in more general settings.

    Freedom and Restraint: Tags, Vocabularies and Ontologies

    The benefit of metadata is widely recognized. However, the nature of that information and the method of its production remain a topic of some debate. This division is most noticeable between those who believe in 'free tagging' and those who prefer the more formal construction of an ontology to define both the vocabulary of the domain and the relationships between the concepts within it. Looking at the community surrounding online amateur authors and the descriptive metadata they have developed over the last thirty years, we consider what we can learn from a mature but amateur tagging community. This paper considers how these two systems might be used together to add the easy usability of free tagging to ontology descriptions and the conceptual richness of ontologies to free tags.

    An ultra scale-down analysis of the recovery by dead-end centrifugation of human cells for therapy.

    An ultra scale-down method is described to determine the response of cells to recovery by dead-end (batch) centrifugation under commercially defined manufacturing conditions. The key variables studied are the cell suspension hold time prior to centrifugation, the relative centrifugal force (RCF), the time of centrifugation, the cell pellet resuspension velocities, and the number of resuspension passes. The cell critical quality attributes studied are the cell membrane integrity and the presence of selected surface markers. Greater hold times and higher RCF values for longer spin times all led to increased loss of cell membrane integrity. However, this loss was found to occur during intense cell resuspension rather than during the preceding centrifugation stage. Controlled resuspension at low stress conditions, below a possible critical stress point, led to essentially complete cell recovery even at conditions of extreme centrifugation (e.g., RCF of 10000 g for 30 min) and long (~2 h) holding times before centrifugation. The susceptibility to cell loss during resuspension under conditions of high stress depended on the cell type, the age of the cells before centrifugation, and the level of matrix crosslinking within the cell pellet, as determined by the presence of detachment enzymes or possibly the nature of the resuspension medium. Changes in cell surface markers were significant in some cases, but to a lesser extent than the loss of cell membrane integrity. Biotechnol. Bioeng. 2015;112: 997-1011. © 2014 Wiley Periodicals, Inc.
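    The extreme centrifugation condition quoted above (RCF of 10000 g) can be related to rotor speed via the standard RCF relation. A minimal sketch in Python, assuming a hypothetical 10 cm rotor radius (the abstract does not state the rotor geometry):

    ```python
    import math

    def rpm_for_rcf(rcf_g: float, rotor_radius_cm: float) -> float:
        """Rotor speed (rpm) needed to reach a given relative centrifugal
        force (RCF, in multiples of g) at radius r, using the standard
        relation RCF = 11.18 * r_cm * (rpm / 1000)**2."""
        return 1000.0 * math.sqrt(rcf_g / (11.18 * rotor_radius_cm))

    # The paper's extreme condition of 10000 g, for a hypothetical
    # 10 cm rotor radius:
    print(round(rpm_for_rcf(10000, 10)))  # prints 9458
    ```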

    Effective forecasting for supply-chain planning: an empirical evaluation and strategies for improvement

    Demand forecasting is a crucial aspect of the planning process in supply-chain companies. The most common approach to forecasting demand in these companies involves the use of a simple univariate statistical method to produce a forecast, followed by the judgmental adjustment of this forecast by the company's demand planners to take into account market intelligence relating to any exceptional circumstances expected over the planning horizon. Based on four company case studies, which included collecting more than 12,000 forecasts and outcomes, this paper examines: i) the extent to which the judgmental adjustments led to improvements in accuracy, ii) the extent to which the adjustments were biased and inefficient, iii) the circumstances where adjustments were detrimental or beneficial, and iv) methods that could lead to greater levels of accuracy. It was found that the judgmentally adjusted forecasts were both biased and inefficient. In particular, market intelligence that was expected to have a positive impact on demand was used far less effectively than intelligence suggesting a negative impact. The paper goes on to propose a set of improvements that could be applied to the forecasting processes in the companies and to the forecasting software that is used in these processes.
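    The evaluation the paper describes, comparing the accuracy and bias of statistical versus judgmentally adjusted forecasts, can be sketched as follows. The numbers are toy data for illustration, not the paper's:

    ```python
    # Toy series: a statistical forecast, its judgmentally adjusted
    # version, and the realised demand.
    statistical = [100, 120, 90, 110, 130, 80]
    adjusted    = [115, 118, 95, 125, 128, 70]
    actual      = [105, 119, 93, 112, 129, 72]

    def mae(forecasts, actuals):
        """Mean absolute error of a forecast series."""
        return sum(abs(f - a) for f, a in zip(forecasts, actuals)) / len(actuals)

    print("MAE statistical:", mae(statistical, actual))
    print("MAE adjusted:   ", mae(adjusted, actual))

    # Bias (mean error) of the adjusted forecasts, split by adjustment
    # direction -- the paper found positive ("optimistic") adjustments
    # were used far less effectively than negative ones.
    pos = [f - a for s, f, a in zip(statistical, adjusted, actual) if f > s]
    neg = [f - a for s, f, a in zip(statistical, adjusted, actual) if f < s]
    print("mean error after positive adjustments:", sum(pos) / len(pos))
    print("mean error after negative adjustments:", sum(neg) / len(neg))
    ```

    In this toy example the adjustments worsen overall accuracy and the positive adjustments carry a clear positive bias, mirroring the qualitative pattern reported in the abstract.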

    Mach-Zehnder optical configuration with Brewster window and two quarter-wave plates

    This configuration is an improvement for the following reasons: it provides higher efficiency, and it reduces or eliminates feedthrough of the untranslated local oscillator, which would otherwise produce a beat signal at the shifted frequency of the translator. When used without the translator and with a low-power detector, the telescope secondary mirror reflects a portion of the output to the local oscillator.

    Measuring Planck beams with planets

    Aims. Accurate measurement of the cosmic microwave background (CMB) anisotropy requires precise knowledge of the instrument beam. We explore how well the Planck beams will be determined from observations of planets, developing techniques that are also appropriate for other experiments. Methods. We simulate planet observations with a Planck-like scanning strategy, telescope beams, noise, and detector properties. Then we employ both parametric and non-parametric techniques, reconstructing beams directly from the time-ordered data. With a faithful parameterization of the beam shape, we can constrain certain detector properties, such as the time constants of the detectors, to high precision. Alternatively, we decompose the beam using an orthogonal basis. For both techniques, we characterize the errors in the beam reconstruction with Monte Carlo realizations. For a simplified scanning strategy, we study the impact on estimation of the CMB power spectrum. Finally, we explore the consequences for measuring cosmological parameters, focusing on the spectral index of primordial scalar perturbations, n_s. Results. The quality of the power spectrum measurement will be significantly influenced by the optical modeling of the telescope. In our most conservative case, using no information about the optics except the measurement of planets, we find that a single transit of Jupiter across the focal plane will measure the beam window functions to better than 0.3% for the channels at 100–217 GHz that are the most sensitive to the CMB. Constraining the beam with optical modeling can lead to much higher quality reconstruction. Conclusions. Depending on the optical modeling, the beam errors may be a significant contribution to the measurement systematics for n_s.
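    For intuition about how beam knowledge propagates into the window function, here is a minimal sketch assuming a symmetric Gaussian beam (a simplification; the paper reconstructs non-parametric beam shapes) with a hypothetical 7 arcmin FWHM, roughly the scale of Planck's CMB channels:

    ```python
    import math

    def window(ell: int, fwhm_arcmin: float) -> float:
        """Window function B_l^2 = exp(-l(l+1) sigma^2) of a symmetric
        Gaussian beam, with sigma = FWHM / sqrt(8 ln 2)."""
        sigma = math.radians(fwhm_arcmin / 60.0) / math.sqrt(8.0 * math.log(2.0))
        return math.exp(-ell * (ell + 1) * sigma ** 2)

    # Fractional window-function error at l = 1000 caused by a
    # hypothetical 1% error in the beam width:
    err = window(1000, 7.0 * 1.01) / window(1000, 7.0) - 1.0
    print(f"{err:.3%}")
    ```

    Even a 1% beam-width error produces a window-function error of order a percent at high multipoles, which is why the sub-percent beam determination quoted in the abstract matters for the power spectrum and for n_s.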

    The development and characteristics of a hand-held high power diode laser-based industrial tile grout removal and single-stage sealing system

    As the field of laser materials processing becomes ever more diverse, the high power diode laser (HPDL) is now being regarded by many as the most applicable tool. In this work, the commercialisation of an industrial epoxy grout removal and single-stage ceramic tile grout sealing process is examined through the development of a hand-held HPDL device. Further, an appraisal is given of the potential hazards associated with the use of the HPDL in an industrial environment and of the solutions implemented to ensure that the system complies with the relevant safety standards. The paper describes the characteristics and feasibility of the industrial epoxy grout removal process. A minimum power density for removal of approximately 3 kW/cm2 was found to exist, whilst the minimum interaction time, below which there was no removal of epoxy tile grout, was found to be approximately 0.5 s. The maximum theoretical removal rate that may be achievable was calculated to be 65.98 mm2/s for a circular 2 mm diameter beam with a power density of 3 kW/cm2 and a traverse speed of 42 mm/s. In addition, the characteristics of the single-stage ceramic tile grout sealing are outlined. The single-stage ceramic tile grout sealing process yielded crack- and porosity-free seals produced under normal atmospheric conditions. Tiles were successfully sealed with power densities as low as 550 W/cm2 and at rates of up to 420 mm/min. In terms of mechanical, physical and chemical characteristics, the single-stage ceramic tile grout was found to be far superior to the conventional epoxy tile grout and, in many instances, matched and occasionally surpassed that of the ceramic tiles themselves.
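    The quoted maximum removal rate of 65.98 mm2/s is consistent with the area swept per unit time by a circular beam, (π/4)·d·v, i.e. the circle's mean chord width times the traverse speed. This reading of the figure is an inference from the numbers, not stated in the abstract:

    ```python
    import math

    d = 2.0    # beam diameter, mm
    v = 42.0   # traverse speed, mm/s

    # Area swept per unit time by a circular beam: mean chord width
    # (pi/4 * d) times traverse speed.
    rate = (math.pi / 4.0) * d * v
    print(f"{rate:.2f} mm^2/s")  # prints 65.97 mm^2/s
    ```

    This reproduces the paper's 65.98 mm2/s to within rounding of π.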