
    Pre-discovery and Follow-up Observations of the Nearby SN 2009nr: Implications for Prompt Type Ia SNe

    We present photometric and spectroscopic observations of the Type Ia supernova SN 2009nr in UGC 8255 (z = 0.0122). Following the discovery announcement, which turned out to come ten days after peak, we detected the SN at V ≈ 15.7 mag in data collected by the All Sky Automated Survey (ASAS) North telescope two weeks prior to peak, and then followed it up with telescopes ranging in aperture from 10 cm to 6.5 m. Using early photometric data available only from ASAS, we find that the SN is similar to the over-luminous Type Ia SN 1991T, with a peak at M_V = -19.6 mag and a slow decline rate of Δm_15(B) = 0.95 mag. The early post-maximum spectra closely resemble those of SN 1991T, while the late-time spectra are more similar to those of normal Type Ia SNe. Interestingly, SN 2009nr has a projected distance of 13.0 kpc (~4.3 disk scale lengths) from the nucleus of its small star-forming host galaxy UGC 8255. This indicates that the progenitor of SN 2009nr is not associated with a young stellar population, calling into question the conventional association of luminous SNe Ia with the "prompt" component directly correlated with current star formation. The pre-discovery observation of SN 2009nr by ASAS demonstrates the scientific utility of high-cadence all-sky surveys conducted with small telescopes for the discovery of nearby (d ≤ 50 Mpc) supernovae. Comment: 11 pages, 11 figures, 4 tables. Accepted for publication in ApJ on 11/02/201
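
    As a quick plausibility check on the quoted numbers (an illustrative sketch only; the value H_0 ≈ 70 km/s/Mpc and the neglect of extinction are assumptions, not taken from the abstract), the redshift and peak absolute magnitude together imply a peak apparent brightness of roughly V ≈ 14 mag:

        d_L \approx \frac{cz}{H_0} \approx \frac{(3\times10^{5}\,\mathrm{km\,s^{-1}})(0.0122)}{70\,\mathrm{km\,s^{-1}\,Mpc^{-1}}} \approx 52\ \mathrm{Mpc}

        \mu = 5\log_{10}\!\left(\frac{d_L}{10\ \mathrm{pc}}\right) \approx 33.6\ \mathrm{mag}

        V_{\mathrm{peak}} \approx \mu + M_V \approx 33.6 - 19.6 \approx 14.0\ \mathrm{mag}

    which is consistent with an object still on the rise at V ≈ 15.7 mag two weeks before maximum.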

    A reduced-reference perceptual image and video quality metric based on edge preservation

    In image and video compression and transmission, it is important to rely on an objective image/video quality metric that accurately represents the subjective quality of processed images and video sequences. In some scenarios, it is also important to evaluate the quality of the received video sequence with minimal reference to the transmitted one. For instance, for quality improvement of video transmission through closed-loop optimisation, the video quality measure can be evaluated at the receiver and provided as feedback information to the system controller. The original image/video sequence, prior to compression and transmission, is not usually available at the receiver side, so the receiver must rely on an objective video quality metric that requires no reference, or only minimal reference, to the original video sequence. The observation that the human eye is very sensitive to the edge and contour information of an image underpins our proposed reduced-reference (RR) quality metric, which compares edge information between the distorted and the original image. Results highlight that the metric correlates well with subjective observations, also in comparison with commonly used full-reference metrics and with a state-of-the-art RR metric. © 2012 Martini et al.
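
    To make the idea concrete, here is a minimal sketch of an edge-preservation comparison in this spirit (not the authors' metric: the Sobel operator, the threshold, and the preservation score are illustrative assumptions):

        import numpy as np
        from scipy import ndimage

        def edge_map(img, threshold=0.1):
            """Binary edge map from the Sobel gradient magnitude (illustrative choice)."""
            gx = ndimage.sobel(img, axis=0, mode="reflect")
            gy = ndimage.sobel(img, axis=1, mode="reflect")
            mag = np.hypot(gx, gy)
            return mag > threshold * mag.max()

        def edge_preservation_score(reference, distorted):
            """Fraction of reference edge pixels that survive in the distorted image."""
            e_ref = edge_map(reference)
            e_dst = edge_map(distorted)
            preserved = np.logical_and(e_ref, e_dst).sum()
            return preserved / max(int(e_ref.sum()), 1)

    In a reduced-reference deployment, only a compact description of the reference edge information would be sent alongside the video, rather than the full original sequence.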

    On the Temporality of Introducing Code Technical Debt

    Code Technical Debt (TD) is intentionally or unintentionally created when developers introduce inefficiencies into the codebase. This can be attributed to various reasons such as a heavy workload, tight delivery schedules, or unawareness of good practices. To shed light on the context that leads to technical debt accumulation, in this paper we investigate: (a) the temporality of code technical debt introduction in new methods, i.e., whether the introduction of technical debt is stable across the lifespan of the project or whether its evolution presents spikes; and (b) the relation between technical debt introduction and the development team's workload in a given period. To answer these questions, we perform a case study on twenty-seven Apache projects and inspect the number of Technical Debt Items introduced in 6-month sliding temporal windows. The results of the study suggest that: (a) overall, the number of Technical Debt Items introduced through new code is a stable metric, although it presents some spikes; and (b) the number of commits performed is not strongly correlated with the number of introduced Technical Debt Items.
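
    The windowing and correlation analysis can be sketched as follows (the column names, the 182-day approximation of six months, and the choice of Spearman correlation are assumptions for illustration, not details taken from the paper):

        import pandas as pd

        # One row per commit: its timestamp and the number of Technical Debt
        # Items it introduces (hypothetical toy data).
        events = pd.DataFrame({
            "timestamp": pd.to_datetime(["2018-01-10", "2018-02-03", "2018-07-21", "2019-01-05"]),
            "td_items_introduced": [3, 0, 5, 1],
        }).set_index("timestamp").sort_index()

        # Commits and introduced TD Items inside each 6-month sliding window.
        windows = events["td_items_introduced"].rolling("182D").agg(["sum", "count"])
        windows.columns = ["td_items", "commits"]

        # How strongly does workload (commit count) relate to TD introduction?
        print(windows["commits"].corr(windows["td_items"], method="spearman"))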

    Investigation of ripple-limited low-field mobility in large-scale graphene nanoribbons

    Combining molecular dynamics and quantum transport simulations, we study the degradation of mobility in graphene nanoribbons caused by substrate-induced ripples. First, the atom coordinates of large-scale structures are relaxed such that surface properties are consistent with those of graphene on a substrate. Then, the electron current and low-field mobility of the resulting non-flat nanoribbons are calculated within the Non-equilibrium Green's Function formalism in the coherent transport limit. An accurate tight-binding basis coupling the sigma- and pi-bands of graphene is used for this purpose. It is found that the presence of ripples decreases the mobility of graphene nanoribbons on SiO2 below 3000 cm²/Vs, which is comparable to experimentally reported values. © 2013 AIP Publishing LLC
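
    As a toy illustration of the coherent-transport step (a single-orbital nearest-neighbour chain with wide-band leads, far simpler than the multi-band sigma/pi basis and rippled geometries used in the paper; all parameter values are placeholders):

        import numpy as np

        def transmission(energy, n_sites=20, t=-2.7, gamma=0.5, eta=1e-6):
            """Landauer transmission T(E) = Tr[Gamma_L G Gamma_R G^dagger] for a
            nearest-neighbour tight-binding chain with wide-band lead self-energies."""
            h = np.zeros((n_sites, n_sites), dtype=complex)
            for i in range(n_sites - 1):
                h[i, i + 1] = h[i + 1, i] = t  # hopping (eV), graphene-like value

            sigma_l = np.zeros_like(h)
            sigma_l[0, 0] = -0.5j * gamma       # left lead self-energy on first site
            sigma_r = np.zeros_like(h)
            sigma_r[-1, -1] = -0.5j * gamma     # right lead self-energy on last site
            gamma_l = 1j * (sigma_l - sigma_l.conj().T)  # broadening of the left lead
            gamma_r = 1j * (sigma_r - sigma_r.conj().T)  # broadening of the right lead

            g = np.linalg.inv((energy + 1j * eta) * np.eye(n_sites) - h - sigma_l - sigma_r)
            return np.trace(gamma_l @ g @ gamma_r @ g.conj().T).real

        print(transmission(0.1))  # a single transverse mode gives T(E) <= 1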

    Delayed-Choice Entanglement Swapping with Vacuum-One-Photon Quantum States

    We report the experimental realization of a quantum information protocol recently proposed by Asher Peres that implies an apparent non-local quantum mechanical retrodiction effect. The demonstration is carried out by applying a novel quantum optical method in which each singlet entangled state is physically implemented by a two-dimensional subspace of Fock states of a mode of the electromagnetic field, specifically the space spanned by the vacuum and the one-photon state, along lines suggested recently by E. Knill et al., Nature 409, 46 (2001) and by L.-M. Duan et al., Nature 414, 413 (2001). The successful implementation of the new technique is expected to play an important role in modern quantum information and communication and in EPR quantum non-locality studies.
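
    One way to write the single-mode qubit encoding described above, with the logical basis spanned by the vacuum and one-photon Fock states of a field mode, and the resulting singlet-type state of two modes a and b (the overall sign convention is an assumption; the abstract only specifies that a singlet entangled state is encoded in this two-dimensional subspace):

        |\bar{0}\rangle \equiv |0\rangle, \qquad |\bar{1}\rangle \equiv |1\rangle

        |\psi^{-}\rangle_{ab} = \frac{1}{\sqrt{2}} \bigl( |0\rangle_a |1\rangle_b - |1\rangle_a |0\rangle_b \bigr)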

    Empirical Evaluation of Deep Learning Approaches for Landmark Detection in Fish Bioimages

    In this paper we perform an empirical evaluation of variants of deep learning methods to automatically localize anatomical landmarks in bioimages of fishes acquired using different imaging modalities (microscopy and radiography). We compare two methodologies, namely heatmap-based regression and multivariate direct regression, and evaluate them in combination with several Convolutional Neural Network (CNN) architectures. Heatmap-based regression approaches employ Gaussian or Exponential heatmap generation functions combined with CNNs to output heatmaps corresponding to landmark locations, whereas direct regression approaches directly output the (x, y) coordinates of the landmarks. In our experiments, we use two microscopy datasets of zebrafish and medaka and one radiography dataset of gilthead seabream. On our three datasets, the heatmap approach with the Exponential function and the U-Net architecture performs best. Datasets and open-source code for training and prediction are made available to ease future landmark detection research and bioimaging applications.
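
    A small sketch of the heatmap target generation and decoding used by heatmap-based regression (the Gaussian form, sigma, and grid size are placeholder choices; the Exponential heatmap function mentioned above is not shown):

        import numpy as np

        def gaussian_heatmap(height, width, landmark_xy, sigma=3.0):
            """2D Gaussian target peaked at a landmark (x, y); the CNN is trained
            to regress one such map per landmark instead of raw coordinates."""
            x0, y0 = landmark_xy
            ys, xs = np.mgrid[0:height, 0:width]
            return np.exp(-((xs - x0) ** 2 + (ys - y0) ** 2) / (2.0 * sigma ** 2))

        def decode_landmark(heatmap):
            """Recover (x, y) as the argmax of a predicted heatmap."""
            y, x = np.unravel_index(np.argmax(heatmap), heatmap.shape)
            return x, y

        hm = gaussian_heatmap(128, 128, (40, 85))
        print(decode_landmark(hm))  # -> (40, 85)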

    Violation of multi-particle Bell inequalities for low and high flux parametric amplification using both vacuum and entangled input states

    We show how polarisation measurements on the output fields generated by parametric down-conversion will reveal a violation of multi-particle Bell inequalities in the regime of both low and high output intensity. In this case each spatially separated system, upon which a measurement is performed, comprises more than one particle. In view of the formal analogy with spin systems, the proposal provides an opportunity to test the predictions of quantum mechanics for spatially separated higher-spin states. In this way, quantum behaviour may be demonstrated even where measurements are performed on systems of large quantum (particle) number. Our proposal applies to both vacuum-state signal and idler inputs, and also to the quantum-injected parametric amplifier as studied by De Martini et al. The effect of detector inefficiencies is included. Comment: 12 pages, 12 figures
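
    For orientation, the two-party, two-setting (spin-1/2) Bell-CHSH inequality that the multi-particle inequalities studied here generalise, written in terms of correlation functions E of the measurement settings:

        S = E(a, b) + E(a, b') + E(a', b) - E(a', b'), \qquad |S| \le 2 \ \text{(local hidden variables)}, \qquad |S| \le 2\sqrt{2} \ \text{(quantum mechanics)}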

    Optimal Quantum Cloning via Stimulated Emission

    We show that optimal universal quantum cloning can be realized via stimulated emission. Universality of the cloning procedure is achieved by choosing systems that have appropriate symmetries. We first discuss a scheme based on stimulated emission in certain three-level systems, e.g. atoms in a cavity. Then we present a way of realizing optimal universal cloning based on stimulated parametric down-conversion. This scheme also implements the optimal universal NOT operation. Comment: 4 pages, 3 figures
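
    For reference, the optimal universal cloning fidelity for qubits (a standard result not restated in the abstract) that such a stimulated-emission cloner is designed to attain:

        F_{N \to M} = \frac{MN + M + N}{M(N + 2)}, \qquad F_{1 \to 2} = \frac{5}{6}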