
    Crafting Collaboratively with Technologies of Re-production

    This short paper draws on and compares two projects in which the authors used digital and analogue reproduction technologies in collaborations with artists. In the first, artists were recruited to take part in iPad painting workshops and try out populist painting apps. The second project involved the earliest print technology, the woodcut: coloured inks, rollers and wooden spoons were used by the first author in her role as "master printer", "pulling" limited edition prints by hand from blocks of incised wood in commercial fine art production. Digitisation permits massive, instantaneous copying and distribution without any loss of quality; limiting the reproduction and dissemination of prints makes each one more collectable and valuable. The paper considers how the inherent material degradation of traditional printmaking is a condition to which digital processes might aspire.

    Play and Learn: Using Video Games to Train Computer Vision Models

    Video games are a compelling source of annotated data, as they can readily provide fine-grained ground truth for diverse tasks. However, it is not clear whether synthetically generated data resembles real-world images closely enough to improve the performance of computer vision models in practice. We present experiments assessing how systems trained on synthetic RGB images extracted from a video game perform on real-world data. We collected over 60,000 synthetic samples from a modern video game under conditions similar to those of the real-world CamVid and Cityscapes datasets. Several experiments demonstrate that the synthetically generated RGB images can be used to improve the performance of deep neural networks on both image segmentation and depth estimation. These results show that a convolutional network trained on synthetic data achieves a test error similar to that of a network trained on real-world data for dense image classification. Furthermore, the synthetically generated RGB images can provide similar or better results than the real-world datasets if a simple domain adaptation technique is applied. Our results suggest that collaboration with game developers on accessible interfaces for gathering data is a potentially fruitful direction for future work in computer vision. (To appear in the British Machine Vision Conference (BMVC), September 2016.)
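    One common form of the "simple domain adaptation technique" mentioned above is to pre-train on the large synthetic set and then fine-tune on a small amount of real-world data. The sketch below illustrates that recipe, not the authors' exact pipeline; the FrameDataset placeholder, the 12-class label set and all hyperparameters are assumptions.

```python
# Hypothetical sketch: pre-train a segmentation network on synthetic game
# frames, then fine-tune on a small set of real CamVid-style images.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, Dataset
from torchvision.models.segmentation import fcn_resnet50

NUM_CLASSES = 12  # CamVid-style label set (assumption)

class FrameDataset(Dataset):
    """Placeholder yielding (image, per-pixel label) pairs; in practice
    this would read game captures or real photographs from disk."""
    def __init__(self, n):
        self.n = n
    def __len__(self):
        return self.n
    def __getitem__(self, i):
        image = torch.rand(3, 240, 320)
        label = torch.randint(0, NUM_CLASSES, (240, 320))
        return image, label

def train(model, loader, epochs, lr):
    """One training phase: per-pixel cross-entropy over the label map."""
    opt = torch.optim.SGD(model.parameters(), lr=lr, momentum=0.9)
    loss_fn = nn.CrossEntropyLoss()
    model.train()
    for _ in range(epochs):
        for images, labels in loader:
            opt.zero_grad()
            out = model(images)["out"]   # (B, NUM_CLASSES, H, W)
            loss = loss_fn(out, labels)
            loss.backward()
            opt.step()

model = fcn_resnet50(num_classes=NUM_CLASSES)
# Phase 1: large synthetic set extracted from the game.
train(model, DataLoader(FrameDataset(60000), batch_size=8, shuffle=True), epochs=1, lr=1e-2)
# Phase 2: fine-tune on a much smaller real-world set at a lower learning
# rate -- the "simple domain adaptation" step.
train(model, DataLoader(FrameDataset(400), batch_size=8, shuffle=True), epochs=1, lr=1e-3)
```

    Fine-tuning at a lower learning rate keeps the features learned from the plentiful synthetic frames while adapting the network to the statistics of real images.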

    Infrared laser desorption/ionization mass spectrometry: fundamentals and applications

    This dissertation on infrared laser desorption/ionization mass spectrometry encompasses fundamental studies of the desorption/ionization process and direct-from-gel laser desorption/ionization applications. Understanding the fundamentals of desorption and ionization can lead to improvements in the technique and to new applications. The experiments aimed at this goal are wavelength studies and two-laser infrared/ultraviolet matrix-assisted laser desorption/ionization experiments. A direct-from-gel application, in which analytes are desorbed and ionized directly after gel electrophoretic separation, is also presented. Protein and peptide standards were used as test analytes, and the infrared lasers were a 10.6 micrometer carbon dioxide laser and a mid-IR optical parametric oscillator. In the wavelength experiments, the optical parametric oscillator was tuned from 2.8 to 3.6 micrometers and the minimum laser fluence producing a detectable ion signal (the threshold fluence) was recorded. Comparison of the threshold fluence with the infrared absorption of the sample indicates that the analyte itself absorbs the laser light. Scanning electron microscopy images of the sample after laser irradiation show melting and indications of explosive boiling. It is concluded from these results that ionization occurs through the sacrifice of some of the protein molecules, which absorb the laser energy and act as an intrinsic matrix. In the two-laser experiments, a mixture of analyte and a light-absorbing matrix was deposited on the sample target and irradiated with an infrared laser followed, after an adjustable delay, by an ultraviolet nitrogen laser. Laser fluences were attenuated below the one-laser ionization threshold, and two-laser signal was obtained at delays of up to several hundred microseconds. The results can be explained by infrared laser heating of the sample, which enhances ultraviolet matrix-assisted laser desorption/ionization. The direct-from-gel experiments used the optical parametric oscillator to ionize electrophoretically separated biomolecules directly from conventional gel slabs and from capillary gels in plastic microfluidic chips. Sensitivity increased when moving from gel slabs to the microfluidic chip design. The technique shows promise for identifying both parent and fragment masses of proteins contained in gels.
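    The analysis step of the wavelength study can be sketched as follows: at each OPO wavelength, the threshold fluence is the lowest fluence whose ion signal exceeds the detection limit, and an inverse relationship between threshold fluence and the sample's IR absorbance is evidence that the analyte itself absorbs the light. This is an illustrative reconstruction with made-up arrays, not the dissertation's actual data or code.

```python
# Hypothetical threshold-fluence analysis for the OPO wavelength scan.
import numpy as np

rng = np.random.default_rng(0)
wavelengths = np.linspace(2.8, 3.6, 9)             # micrometers (OPO scan range)
fluences = np.linspace(100, 5000, 50)              # fluence sweep at each wavelength
ion_signal = rng.random((len(wavelengths), len(fluences)))  # stand-in ion signals
absorbance = rng.random(len(wavelengths))          # stand-in IR absorbance spectrum

DETECTION_LIMIT = 0.8                              # arbitrary signal threshold

def threshold_fluence(signal_vs_fluence):
    """Lowest fluence producing a detectable ion signal, or NaN if none."""
    above = np.nonzero(signal_vs_fluence > DETECTION_LIMIT)[0]
    return fluences[above[0]] if above.size else np.nan

thresholds = np.array([threshold_fluence(row) for row in ion_signal])

# If the analyte absorbs the laser light, high absorbance should coincide
# with a low threshold fluence: a negative correlation.
r = np.corrcoef(absorbance, thresholds)[0, 1]
print(f"correlation(absorbance, threshold fluence) = {r:+.2f}")
```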

    Handling protest responses in contingent valuation surveys

    OBJECTIVES: Protest responses, whereby respondents refuse to state the value they place on a health gain, are commonly encountered in contingent valuation (CV) studies and tend to be excluded from analyses. Such exclusion is biased if protesters differ from non-protesters on characteristics that predict their responses. The Heckman selection model has commonly been used to adjust for protesters, but its underlying assumptions may be implausible in this context. We present a multiple imputation (MI) approach to addressing protest responses in CV studies and compare it with the Heckman selection model. METHODS: This study exploits data from the multinational EuroVaQ study, which surveyed respondents' willingness to pay (WTP) for a quality-adjusted life year (QALY). Our simulation study assesses the relative performance of MI and Heckman selection models across realistic settings grounded in the EuroVaQ study, including scenarios with different proportions of missing data and different non-response mechanisms. We then illustrate the methods in the EuroVaQ study itself, estimating mean WTP for a QALY gain. RESULTS: MI yields lower bias and mean squared error than the Heckman approach across all considered scenarios. The simulations suggest that the Heckman approach can considerably underestimate or overestimate mean WTP owing to violations of the normality assumption, even after log-transforming the WTP responses. The case study shows that protesters are associated with a lower mean WTP for a QALY gain than non-protesters, but that the results differ according to the method used for handling protesters. CONCLUSIONS: MI is an appropriate method for addressing protest responses in CV studies.
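    As a concrete illustration of the MI approach, the sketch below treats protest responses as missing WTP values, imputes them repeatedly from respondent covariates, and pools the estimates with Rubin's rules. The variables, the 25% protest rate and the imputation model (scikit-learn's IterativeImputer) are assumptions for illustration, not the EuroVaQ data or the authors' implementation.

```python
# Minimal multiple-imputation sketch for protest responses in a CV survey.
import numpy as np
import pandas as pd
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "age": rng.integers(18, 80, n),
    "income": rng.lognormal(10, 0.5, n),
    "log_wtp": rng.normal(5, 1, n),     # log-transformed WTP for a QALY gain
})
protest = rng.random(n) < 0.25          # ~25% protest responses (assumption)
df.loc[protest, "log_wtp"] = np.nan     # treat protests as missing, not zero

M = 20                                  # number of imputed datasets
means, variances = [], []
for m in range(M):
    imputer = IterativeImputer(sample_posterior=True, random_state=m)
    completed = imputer.fit_transform(df)
    wtp = np.exp(completed[:, 2])       # back-transform to the WTP scale
    means.append(wtp.mean())
    variances.append(wtp.var(ddof=1) / n)

# Rubin's rules: pooled point estimate and total variance.
qbar = np.mean(means)                   # pooled mean WTP
within = np.mean(variances)             # average within-imputation variance
between = np.var(means, ddof=1)         # between-imputation variance
total_var = within + (1 + 1 / M) * between
print(f"pooled mean WTP = {qbar:.0f}, SE = {np.sqrt(total_var):.0f}")
```

    Unlike exclusion, this uses the protesters' covariates, so the pooled estimate is unbiased when non-response depends only on observed characteristics.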

    Late Quaternary palaeoceanography of the Benguela upwelling system

    Object replication in a distributed system

    PhD Thesis. A number of techniques have been proposed for the construction of fault-tolerant applications. One such technique is to replicate vital system resources so that, if one copy fails, sufficient copies remain operational for the application to continue functioning. Interactions with replicated resources are inherently more complex than non-replicated interactions, and hence some form of replication transparency is necessary. This may be achieved by employing replica consistency protocols to mask replica failures and maintain consistency of state between functioning replicas. To achieve consistency between replicas it is necessary to ensure that all replicas receive the same set of messages in the same order, despite failures at the senders and receivers. This can be accomplished with order-preserving reliable communication protocols. However, we shall show how it can be more efficient to use unordered reliable communication and to impose ordering at the application level, by making use of syntactic knowledge of the application. This thesis develops techniques for replicating objects. In general this is harder than replicating data, as objects (which can contain data) can also contain calls on other objects; handling replicated objects is essentially the same as handling replicated computations, and presents more problems than simply replicating data. We shall use the concept of the object to provide transparent replication to users: a user interacts with a single object interface that hides the fact that the object is actually replicated. The main aspects of the replication scheme presented in this thesis have been fully implemented and tested, including the design and implementation of a replicated object invocation protocol and the algorithms that ensure that (replicated) atomic actions can manipulate replicated objects.
    Research Studentship, Science and Engineering Research Council; Esprit Project 2267 (Integrated Systems Architecture).
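    The idea of replication transparency through a single object interface can be illustrated with a toy proxy that fans each invocation out to all replicas and returns the reply agreed by a majority. This is a sketch of the general idea only, not the invocation protocol developed in the thesis; in particular it applies calls in proxy order rather than using an application-level ordering protocol, and the Account class is a made-up example resource.

```python
# Toy replication-transparency sketch: one proxy, many replicas.
from collections import Counter

class Account:
    """The replicated resource: a trivial bank account object."""
    def __init__(self):
        self.balance = 0
    def deposit(self, amount):
        self.balance += amount
        return self.balance

class ReplicatedProxy:
    """Presents a single object interface; forwards each invocation to
    every replica and returns the majority reply, masking failed or
    inconsistent replicas."""
    def __init__(self, replicas):
        self.replicas = replicas
    def __getattr__(self, name):
        def invoke(*args, **kwargs):
            replies = []
            for replica in self.replicas:
                try:
                    replies.append(getattr(replica, name)(*args, **kwargs))
                except Exception:       # a crashed replica is simply skipped
                    pass
            if not replies:
                raise RuntimeError("all replicas failed")
            reply, votes = Counter(replies).most_common(1)[0]
            if votes <= len(self.replicas) // 2:
                raise RuntimeError("no majority agreement among replicas")
            return reply
        return invoke

account = ReplicatedProxy([Account(), Account(), Account()])
print(account.deposit(100))   # client sees one object; prints 100
print(account.deposit(50))    # prints 150, even if one replica crashes
```

    The client never names a replica: `account` behaves like a single Account, and the failure of any one copy is masked as long as a majority still agrees on the reply.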