    The Rise Of The Exciton In Solid Ammonia

    We study the dynamics of a system which, over minutes to hours, establishes a population of nuclei that can then drive a phase change, by following the rise of an exciton in thin films of solid ammonia at deposition temperatures T_d = 48 K, 50 K and 52 K. This behaviour is tracked through the growth of the exciton, using vacuum ultraviolet absorption spectra of the ices at 194.4 nm in the Ã ¹A₂″ ← X̃ ¹A′ band. Absorbance is observed to increase by an order of magnitude between T_d = 48 K and 52 K, through greater flexing of the solid-state structure, as the average crystallite size expands from 10 ± 2 unit cells at 48 K to 34 ± 8 at 52 K. Time delays associated with nucleation are encountered before the appearance of exciton absorption, falling from 7870 s at 48 K to 120 s at 52 K, with rates of subsequent exciton absorbance growth between 1.49 × 10⁻⁶ s⁻¹ and 1.19 × 10⁻⁴ s⁻¹. Activation energies of 21.7 ± 0.2 kJ mol⁻¹ for the nucleation process and 22.8 ± 0.2 kJ mol⁻¹ for the phase change are derived, corresponding to the breaking of two to three hydrogen bonds. These results demonstrate a new means of tracking nucleation and recrystallization rates in polymorphic systems.
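
    As a minimal illustration of where activation energies of this kind come from (assuming, as is standard for thermally activated processes, that the induction delay scales inversely with an Arrhenius rate k = A exp(-Ea/RT); this two-point treatment is a sketch, not the authors' stated method), the two reported delay times alone reproduce the nucleation value:

        import math

        R = 8.314  # gas constant, J mol^-1 K^-1

        # Nucleation delays reported in the abstract
        t1, T1 = 7870.0, 48.0  # seconds, kelvin
        t2, T2 = 120.0, 52.0

        # If delay ~ 1/k with k = A*exp(-Ea/(R*T)), two temperatures
        # are enough to eliminate the prefactor A and solve for Ea.
        Ea = R * math.log(t1 / t2) / (1.0 / T1 - 1.0 / T2)
        print(f"Ea = {Ea / 1000:.1f} kJ/mol")  # -> 21.7 kJ/mol

    The same two-point calculation applied to the growth rates (1.49 × 10⁻⁶ and 1.19 × 10⁻⁴ s⁻¹) gives roughly 22.7 kJ mol⁻¹, consistent with the 22.8 ± 0.2 kJ mol⁻¹ derived for the phase change.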

    Conversion of Artificial Recurrent Neural Networks to Spiking Neural Networks for Low-power Neuromorphic Hardware

    In recent years, the field of low-power neuromorphic systems, which consume orders of magnitude less power than conventional hardware, has gained significant momentum. However, their wider use is still hindered by the lack of algorithms that can harness the strengths of such architectures. While neuromorphic adaptations of representation learning algorithms are now emerging, efficient processing of temporal sequences or variable-length inputs remains difficult. Recurrent neural networks (RNNs) are widely used in machine learning to solve a variety of sequence learning tasks. In this work we present a "train-and-constrain" methodology that enables the mapping of machine-learned (Elman) RNNs onto a substrate of spiking neurons, while remaining compatible with the capabilities of current and near-future neuromorphic systems. The method consists of first training RNNs using backpropagation through time, then discretizing the weights, and finally converting them to spiking RNNs by matching the responses of artificial neurons with those of the spiking neurons. We demonstrate our approach on a natural language processing task (question classification), carrying out the entire mapping of the recurrent layer of the network onto IBM's Neurosynaptic System "TrueNorth", a spike-based digital neuromorphic hardware architecture. TrueNorth imposes specific constraints on connectivity and on neural and synaptic parameters. To satisfy these constraints, it was necessary to discretize the synaptic weights and neural activities to 16 levels, and to limit fan-in to 64 inputs. We find that short synaptic delays are sufficient to implement the dynamical (temporal) aspect of the RNN in the question classification task. The hardware-constrained model achieved 74% accuracy in question classification while using less than 0.025% of the cores on one TrueNorth chip, resulting in an estimated power consumption of ~17 µW.
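
    To make the "constrain" step concrete, here is a minimal sketch of the two hardware constraints named above: quantizing weights to 16 levels and capping fan-in at 64 inputs. The function names and the uniform quantization scheme are illustrative assumptions for this sketch, not the paper's implementation or TrueNorth's API.

        import numpy as np

        def discretize_weights(w, levels=16):
            # Uniform symmetric quantization to a fixed number of levels;
            # the paper's exact discretization scheme may differ.
            step = 2 * np.abs(w).max() / (levels - 1)
            return np.round(w / step) * step

        def limit_fan_in(w, max_fan_in=64):
            # Keep only the largest-magnitude incoming weights per neuron
            # (rows = postsynaptic neurons, columns = presynaptic inputs).
            w = w.copy()
            for row in w:
                if np.count_nonzero(row) > max_fan_in:
                    cut = np.argsort(np.abs(row))[:-max_fan_in]
                    row[cut] = 0.0
            return w

        # Example: constrain a trained recurrent weight matrix
        w_rec = np.random.randn(128, 128)
        w_hw = limit_fan_in(discretize_weights(w_rec), max_fan_in=64)

    On the actual chip the constrained weights must additionally respect TrueNorth's core and connectivity layout, which this sketch ignores.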

    Tweets as impact indicators: Examining the implications of automated bot accounts on Twitter

    This brief communication presents preliminary findings on automated Twitter accounts distributing links to scientific papers deposited on the preprint repository arXiv. It discusses the implications of the presence of such bots from the perspective of social media metrics (altmetrics), where mentions of scholarly documents on Twitter have been suggested as a measure of impact that is both broader and timelier than citations. We present preliminary findings that automated Twitter accounts generate a considerable number of tweets linking to scientific papers, and that they behave differently from common social bots, which has critical implications for the use of raw tweet counts in research evaluation and assessment. We discuss some definitions of Twitter cyborgs and bots in scholarly communication and propose differentiating between levels of engagement, from tweeting only bibliographic information to discussing or commenting on the content of a paper.
    Comment: 9 pages, 4 figures, 1 table

    Safety, play, enablement, and active involvement: Themes from a Grounded Theory study of practitioner and client experiences of change processes in Dramatherapy

    Objective: This study aims to investigate how dramatherapists and dramatherapy clients experience change in therapy, and whether the change processes identified are consistent across dramatherapeutic approaches. Method: Seven dramatherapists and seven dramatherapy clients were interviewed about their experiences of dramatherapy. Using a grounded theory method, three core themes were constructed from the data. Results: The resulting core categories – (1) working within a safe distance; (2) the client being allowed, and allowing themselves, to play and try out new ways of being; and (3) being actively involved in therapy: creating something visible and having physical experiences using the body – capture the experience of change for both dramatherapists and clients in therapy. Key change mechanisms were also proposed, including developing new awareness and finding a language to communicate. Main conclusions: A focus on developing new awareness and increased insight into the self are important outcomes for therapy and need to be clearly communicated as such. Future research should further explore the key themes identified, and the client's development of increased reflective functioning as a key change mechanism during dramatherapy.

    Characterisation of the Colour Doppler Twinkle Artefact

    This investigation involved the development of a range of Colour Doppler Twinkle Artefact phantoms to characterise and quantify the "Twinkle" artefact, which is often present when an irregular structure is encountered in the imaged field of view. The artefact occurs in both colour and power Doppler ultrasound imaging and manifests as a false depiction of colour velocity information in stationary soft tissue; it can therefore cause significant misdiagnosis of areas of flow within the patient. It has been hypothesised that the artefact is generated by a strongly reflecting medium composed of individual reflectors, and it therefore becomes a clinical concern when parenchymal calcifications are encountered (Tsao et al., 2006). The aim of this study was to investigate the occurrence and magnitude of this artefact across a range of ultrasound scanners and to monitor the effects of varying image acquisition parameters on the artefact. A range of phantoms was produced that could reproducibly recreate the Twinkle artefact, the presence of which was quantified on a range of scanners (Zonare, Siemens Antares, Philips HDI and iU22). These phantoms included both fine and coarse structures, as well as a flow channel in one of the phantoms, through which blood-mimicking fluid was pumped. A semi-quantitative grading system was implemented, and instrument controls such as pulse repetition frequency (PRF), colour write priority, greyscale gain and depth of focal zone were varied in order to determine their impact on the Twinkle artefact. Instrument control settings were found to significantly affect the intensity of the artefact; PRF in particular produced a significant increase in its presence. Furthermore, the extent of the artefact varied greatly across the range of scanners, with the Siemens Antares and Zonare being most sensitive to it. This study has shown the Twinkle artefact to be dependent on scanner specifications and instrument parameters; with careful image optimisation, a reduction or elimination of the artefact can be achieved.

    EXPLORING VIENNESE TUNING AND ITS BENEFITS FOR THE MODERN DOUBLE BASSIST

    The role of the double bass in Vienna evolved significantly between 1760 and 1812. During these years, Viennese composers began to view the double bass less as an accompanimental instrument and more as a solo voice. Despite the abundance of music written for the double bass during this time, few of these compositions are regularly performed today. This dissertation serves three purposes. First, I explore how learning eighteenth-century Viennese compositions in the original tuning can influence modern performances of these works. Secondly, I document the arrangement of a lesser-known work for the modern-tuned bass, using the manuscript as the source material. Finally, by performing a variety of eighteenth-century bass works, I bring this music to the public's attention. The research for this dissertation has been presented in two forms. The recitals present both solo and chamber works from eighteenth-century Vienna; the repertoire for the three recitals was chosen so that each recital addresses one of the three purposes mentioned above. The research paper presents performance practices of the eighteenth century, the challenges the modern double bassist faces when playing this literature, and an examination of how to arrange one of these works for the modern-tuned double bass. The three recitals were performed on the campus of the University of Maryland in Leah M. Smith Hall, Gildenhorn Recital Hall and the Ulrich Recital Hall, respectively. Recordings of all three recitals can be found in the Digital Repository at the University of Maryland (DRUM).