
    Observations and radiative transfer modelling of a massive dense cold core in G333

    Cold massive cores are among the earliest manifestations of high-mass star formation. Following the detection of SiO emission from G333.125-0.562, a cold massive core, further investigations of the physics, chemistry and dynamics of this object have been carried out. Mopra and NANTEN2 molecular line profile observations, Australia Telescope Compact Array (ATCA) line and continuum emission maps, and Spitzer 24 and 70 μm images were obtained. These new data further constrain the properties of this prime example of the very early stages of high-mass star formation. A model for the source was constructed and compared directly with the molecular line data using a 3D molecular line transfer code, MOLLIE. The ATCA data reveal that G333.125-0.562 is composed of two sources. One of the sources is responsible for the previously detected molecular outflow and is detected in the Spitzer 24 and 70 μm band data. Turbulent velocity widths are lower than in other, more active regions of G333, which reflects the younger evolutionary stage and/or lower mass of this core. The molecular line modelling requires abundances of the CO isotopes that strongly imply heavy depletion due to freeze-out of this species onto dust grains. The principal cloud is cold, moderately turbulent and possesses an outflow, which indicates the presence of a central driving source. The secondary source could be an even less evolved object, as it has no apparent association with continuum emission at (far-)infrared wavelengths. Comment: 10 pages, accepted to MNRAS.
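
    As a toy illustration of the kind of line modelling involved (far simpler than MOLLIE's full 3D non-LTE transfer), a single uniform slab gives a brightness-temperature profile T_B(v) = [J(T_ex) - J(T_bg)](1 - e^(-tau(v))), with a Gaussian opacity profile whose velocity width combines thermal and turbulent broadening. The Python sketch below uses entirely hypothetical numbers, not values from the paper.

        import numpy as np

        K_B = 1.380649e-23    # Boltzmann constant, J/K
        H = 6.62607015e-34    # Planck constant, J s
        AMU = 1.66053907e-27  # atomic mass unit, kg

        def radiation_temp(temp_k, nu_hz):
            """Planck radiation temperature J(T) at frequency nu."""
            x = H * nu_hz / K_B
            return x / np.expm1(x / temp_k)

        nu = 115.271e9           # 12CO J=1-0 rest frequency, Hz
        t_ex, t_bg = 15.0, 2.73  # hypothetical excitation temperature; CMB background, K
        tau0 = 2.0               # hypothetical line-centre optical depth
        sigma_th = np.sqrt(K_B * t_ex / (28 * AMU)) / 1e3  # thermal width of CO, km/s
        sigma_turb = 0.5         # hypothetical turbulent width, km/s
        sigma = np.hypot(sigma_th, sigma_turb)             # total Gaussian width, km/s

        v = np.linspace(-5.0, 5.0, 201)              # velocity offsets, km/s
        tau = tau0 * np.exp(-v**2 / (2 * sigma**2))  # Gaussian opacity profile
        t_b = (radiation_temp(t_ex, nu) - radiation_temp(t_bg, nu)) * (1 - np.exp(-tau))
        print(f"peak T_B = {t_b.max():.2f} K, FWHM = {2.355 * sigma:.2f} km/s")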

    Young people's views regarding participation in mental health and wellbeing research through social media

    Social media is a central component in the lives of many young people, and provides innovative potential to conduct research among this population. Ethical issues around online research have been subject to much debate, yet young people have seldom been consulted to provide a youth perspective and voice. Eight focus groups involving 48 Grade 9 Western Australian secondary school students aged 13-14 years were held in 2012 to investigate how young people perceive the feasibility and acceptability of social media when used as a research tool to investigate various issues relevant to their mental health and wellbeing. While young people recognised many benefits of researchers using social media in this way, such as its relevance, innovation and accessibility, there were salient issues of privacy, consent and practicality that require careful negotiation. There is a need for continued exploration and scientific debate of the moral and ethical implications of using social media for research, to help ensure it is employed in an appropriate and effective way that is respectful of and sensitive to the needs and views of young people.

    Genuine Counterfactual Communication with a Nanophotonic Processor

    In standard communication, information is carried by particles or waves. Counterintuitively, in counterfactual communication, particles and information can travel in opposite directions. The quantum Zeno effect allows Bob to transmit a message to Alice by encoding information in particles he never interacts with. The first suggested protocol not only required thousands of ideal optical components, but also resulted in a so-called "weak trace" of the particles having travelled from Bob to Alice, calling the scalability and counterfactuality of previous proposals and experiments into question. Here we overcome these challenges, implementing a new protocol in a programmable nanophotonic processor based on reconfigurable silicon-on-insulator waveguides that operate at telecom wavelengths. This, together with our telecom single-photon source and highly efficient superconducting nanowire single-photon detectors, provides a versatile and stable platform for a high-fidelity implementation of genuinely trace-free counterfactual communication, allowing us to actively tune the number of steps in the Zeno measurement and achieve a bit error probability below 1%, with neither post-selection nor a weak trace. Our demonstration shows how our programmable nanophotonic processor could be applied to more complex counterfactual tasks and quantum information protocols. Comment: 6 pages, 4 figures.
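
    As a rough illustration of why the number of Zeno steps can be tuned against the error rate: in the textbook chained quantum Zeno effect, N small polarization rotations of pi/(2N) are each followed by a projective measurement, and the survival probability cos^(2N)(pi/(2N)) approaches 1 as N grows, with the error falling roughly as pi^2/(4N). The Python sketch below shows only this generic scaling, not the protocol implemented on the processor.

        import numpy as np

        def zeno_survival(n_steps: int) -> float:
            """Survival probability after n_steps chained Zeno stages,
            each rotating by pi/(2*n_steps) before being measured."""
            theta = np.pi / (2 * n_steps)
            return np.cos(theta) ** (2 * n_steps)

        for n in (5, 10, 50, 100, 500):
            # Error probability falls roughly as pi^2 / (4 * n)
            print(f"N={n:4d}  error={1 - zeno_survival(n):.4f}")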

    Experimental Quantum Hamiltonian Learning

    Efficiently characterising quantum systems, verifying operations of quantum devices and validating underpinning physical models are central challenges for the development of quantum technologies and for our continued understanding of foundational physics. Machine learning enhanced by quantum simulators has been proposed as a route to improve the computational cost of performing these studies. Here we interface two different quantum systems through a classical channel, a silicon-photonics quantum simulator and an electron spin in a diamond nitrogen-vacancy centre, and use the former to learn the latter's Hamiltonian via Bayesian inference. We learn the salient Hamiltonian parameter with an uncertainty of approximately 10^{-5}. Furthermore, an observed saturation in the learning algorithm suggests deficiencies in the underlying Hamiltonian model, which we exploit to further improve the model itself. We go on to implement an interactive version of the protocol and experimentally show its ability to characterise the operation of the quantum photonic device. This work demonstrates powerful new quantum-enhanced techniques for investigating foundational physical models and characterising quantum technologies.
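
    A minimal sketch of the Bayesian inference loop, assuming the unknown is a single precession frequency omega with the textbook likelihood P(0 | omega, t) = cos^2(omega*t/2). In the experiment the likelihood is supplied by the photonic quantum simulator; here it is computed classically on a grid, and all numbers are hypothetical.

        import numpy as np

        rng = np.random.default_rng(0)
        omega_true = 0.7                        # hypothetical "unknown" parameter
        grid = np.linspace(0.0, 2.0, 2001)      # discretised support of the prior
        posterior = np.full(grid.size, 1.0 / grid.size)  # flat prior

        for _ in range(200):
            t = rng.uniform(0.0, 20.0)            # random evolution time
            p0 = np.cos(omega_true * t / 2) ** 2  # Born-rule probability of outcome 0
            outcome0 = rng.random() < p0          # simulated measurement
            like = np.cos(grid * t / 2) ** 2      # likelihood of outcome 0 on the grid
            posterior *= like if outcome0 else 1.0 - like
            posterior /= posterior.sum()          # Bayes' rule, renormalised

        mean = (grid * posterior).sum()
        std = np.sqrt(((grid - mean) ** 2 * posterior).sum())
        print(f"omega = {mean:.5f} +/- {std:.5f}  (true value {omega_true})")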

    No imminent quantum supremacy by boson sampling

    It is predicted that quantum computers will dramatically outperform their conventional counterparts. However, large-scale universal quantum computers are yet to be built. Boson sampling is a rudimentary quantum algorithm tailored to the platform of photons in linear optics, which has sparked interest as a rapid way to demonstrate this quantum supremacy. Photon statistics are governed by intractable matrix functions known as permanents, which suggests that sampling from the distribution obtained by injecting photons into a linear-optical network could be solved more quickly by a photonic experiment than by a classical computer. The contrast between the apparently awesome challenge faced by any classical sampling algorithm and the apparently near-term experimental resources required for a large boson sampling experiment has raised expectations that quantum supremacy by boson sampling is on the horizon. Here we present classical boson sampling algorithms and theoretical analyses of prospects for scaling boson sampling experiments, showing that near-term quantum supremacy via boson sampling is unlikely. While the largest boson sampling experiments reported so far used 5 photons, our classical algorithm, based on Metropolised independence sampling (MIS), allowed the boson sampling problem to be solved for 30 photons with standard computing hardware. We argue that the impact of experimental photon losses means that demonstrating quantum supremacy by boson sampling would require a step change in technology. Comment: 25 pages, 9 figures. Comments welcome.
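
    The permanents underlying photon statistics can be evaluated exactly, but only in exponential time; Ryser's formula, sketched below in Python, is the classic O(2^n * n^2) method (a Gray-code variant reaches O(2^n * n)). This is a generic illustration of why the sampling problem is classically expensive, not the paper's Metropolised independence sampler, which uses such permanent evaluations inside its sampling loop.

        from itertools import combinations
        import numpy as np

        def permanent(a: np.ndarray) -> float:
            """Permanent of an n x n matrix via Ryser's formula.
            Like the determinant but without sign alternation; exact
            computation is #P-hard in general."""
            n = a.shape[0]
            total = 0.0
            for k in range(1, n + 1):
                for cols in combinations(range(n), k):
                    row_sums = a[:, list(cols)].sum(axis=1)
                    total += (-1) ** k * np.prod(row_sums)
            return (-1) ** n * total

        a = np.array([[1.0, 2.0], [3.0, 4.0]])
        print(permanent(a))  # 1*4 + 2*3 = 10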

    The potential use of service-oriented infrastructure framework to enable transparent vertical scalability of cloud computing infrastructure

    Cloud computing technology has become familiar to most Internet users. Consequently, the use of cloud computing, including Infrastructure as a Service (IaaS), has grown rapidly. To ensure that IaaS can easily meet growing demand, IaaS providers usually increase capacity vertically, expanding the capability and capacity of local IaaS facilities, for example by adding servers, storage and network bandwidth. However, vertical scalability alone is sometimes not enough, and additional strategies are required to ensure that a large number of IaaS service requests can be met. Strategies involving horizontal scalability are more complex than vertical scalability strategies because they involve the interaction of more than one facility at different service centres. To reduce the complexity of implementing horizontal scalability across IaaS infrastructures, the use of a service-oriented infrastructure is recommended, so that the interaction between two or more different service centres can be handled more simply and easily, even though it is likely to involve a wide range of communication technologies and different cloud computing management systems. This is because the service-oriented infrastructure acts as an intermediary that translates and processes the interactions and protocols of different cloud computing infrastructures without complex modification, ensuring that horizontal scalability runs easily and smoothly. This paper presents the potential of using a service-oriented infrastructure framework to enable transparent vertical scalability of cloud computing infrastructures by adapting three projects in this research: the SLA@SOI consortium, the Open Cloud Computing Interface (OCCI), and OpenStack.
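
    The intermediary role described above can be pictured as an adapter layer: one generic scaling request is translated into each provider's own protocol. The Python sketch below is purely hypothetical; the class and method names are illustrative stand-ins, not the actual SLA@SOI, OCCI, or OpenStack APIs.

        from abc import ABC, abstractmethod

        class CloudAdapter(ABC):
            """Hypothetical adapter: each subclass translates a generic
            scaling request into one provider's own protocol."""
            @abstractmethod
            def add_capacity(self, vcpus: int, storage_gb: int) -> None: ...

        class OpenStackAdapter(CloudAdapter):
            def add_capacity(self, vcpus: int, storage_gb: int) -> None:
                # Placeholder: a real adapter would call OpenStack's APIs here.
                print(f"[openstack] provisioning {vcpus} vCPUs, {storage_gb} GB")

        class OCCIAdapter(CloudAdapter):
            def add_capacity(self, vcpus: int, storage_gb: int) -> None:
                # Placeholder: a real adapter would issue OCCI REST requests here.
                print(f"[occi] provisioning {vcpus} vCPUs, {storage_gb} GB")

        def scale_out(adapters: list[CloudAdapter], vcpus: int, storage_gb: int) -> None:
            """One transparent request, fanned out to heterogeneous service centres."""
            for adapter in adapters:
                adapter.add_capacity(vcpus, storage_gb)

        scale_out([OpenStackAdapter(), OCCIAdapter()], vcpus=8, storage_gb=500)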

    The MAST motional Stark effect diagnostic

    A motional Stark effect (MSE) diagnostic is now installed and operating routinely on the MAST spherical tokamak, with 35 radial channels, spatial resolution of ∼2.5 cm, and time resolution of ∼1 ms at angular noise levels of ∼0.5°. Conventional (albeit very narrow) interference filters isolate π- or σ-polarized emission. Avalanche photodiode detectors with digital phase-sensitive detection measure the harmonics of a pair of photoelastic modulators operating at 20 and 23 kHz, and thus the polarization state. The π component is observed to be significantly stronger than σ, in reasonably good agreement with atomic physics calculations, and as a result almost all channels are now operated on π. Trials with a wide filter that admits the entire Stark pattern (relying on the net polarization of the emission) have demonstrated performance almost as good as the conventional channels. MSE-constrained equilibrium reconstructions can readily be produced between pulses. This work was funded partly by the United Kingdom Engineering and Physical Sciences Research Council under Grant No. P/G003955 and by the European Communities under the contract of association between Euratom and CCFE.
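
    The digital phase-sensitive detection can be sketched as a software lock-in: demodulate the detector signal at the second harmonic of each photoelastic modulator and form the polarization angle from the two amplitudes. The Python model below is heavily simplified (Bessel-function calibration factors are absorbed into the amplitudes), and apart from the 20 and 23 kHz modulator frequencies every number is hypothetical.

        import numpy as np

        fs = 500e3                        # sample rate, Hz (hypothetical)
        t = np.arange(0.0, 0.01, 1 / fs)  # 10 ms of detector signal
        f1, f2 = 20e3, 23e3               # PEM frequencies from the abstract
        gamma = np.deg2rad(12.0)          # hypothetical polarization angle to recover

        # Simplified signal: the second harmonic of each PEM carries
        # sin(2*gamma) and cos(2*gamma) respectively.
        rng = np.random.default_rng(1)
        sig = (1.0 + np.sin(2 * gamma) * np.cos(2 * np.pi * 2 * f1 * t)
                   + np.cos(2 * gamma) * np.cos(2 * np.pi * 2 * f2 * t)
                   + 0.05 * rng.standard_normal(t.size))

        def lock_in(signal, f_ref):
            """Digital phase-sensitive detection at reference frequency f_ref."""
            i = 2 * np.mean(signal * np.cos(2 * np.pi * f_ref * t))
            q = 2 * np.mean(signal * np.sin(2 * np.pi * f_ref * t))
            return np.hypot(i, q)

        a1 = lock_in(sig, 2 * f1)  # 40 kHz amplitude -> sin(2*gamma)
        a2 = lock_in(sig, 2 * f2)  # 46 kHz amplitude -> cos(2*gamma)
        print(f"recovered angle: {np.rad2deg(0.5 * np.arctan2(a1, a2)):.2f} deg")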