
    Passive Techniques for Detecting and Locating Manipulations in Digital Images

    Unpublished thesis (tesis inédita) of the Universidad Complutense de Madrid, Facultad de Informática, defended 19-11-2020. The number of digital cameras integrated into mobile devices, as well as their use in everyday life, is continuously growing. Every day a large number of digital images, whether generated by this type of device or not, circulate on the Internet or are used as evidence in legal proceedings. Consequently, the forensic analysis of digital images has become important in many real-life situations. Forensic analysis of digital images is divided into two main branches: verifying the authenticity of a digital image, and identifying its source of acquisition. The first attempts to discern whether an image has undergone any processing subsequent to its creation, i.e., that it has not been manipulated. The second aims to identify the device that generated the digital image. Verification of the authenticity of digital images can be carried out using either active or passive forensic analysis techniques. Active techniques rely on the fact that digital images carry "marks" present since their creation, so that any alteration made after generation will modify those marks and therefore allow the detection of possible post-processing or manipulation. Passive techniques, on the other hand, analyse authenticity by extracting characteristics from the image itself...
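    One widely used family of techniques in the source-identification branch is sensor-pattern-noise (PRNU) fingerprinting. The sketch below is a deliberately minimal illustration on synthetic data, not the thesis's method: it uses a simple Gaussian filter as the denoiser (real pipelines use wavelet denoising and maximum-likelihood fingerprint estimation), and the fingerprint, image sizes, and noise levels are invented for the demo.

```python
# Minimal PRNU-style camera fingerprinting sketch (illustrative only).
# Idea: residual = image - denoised(image) retains the sensor's noise
# pattern; averaging residuals from many images estimates the camera
# fingerprint, and correlation with a query residual scores the match.
import numpy as np
from scipy.ndimage import gaussian_filter

def noise_residual(img, sigma=2.0):
    """Approximate noise residual: the image minus its denoised version."""
    return img - gaussian_filter(img, sigma)

def estimate_fingerprint(images):
    """Average residuals of many images from one camera."""
    return np.mean([noise_residual(im) for im in images], axis=0)

def correlation(a, b):
    """Normalized cross-correlation used as a matching score."""
    a = a - a.mean()
    b = b - b.mean()
    return float(np.sum(a * b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

rng = np.random.default_rng(0)
K = rng.normal(0.0, 1.0, (64, 64))                        # synthetic fingerprint
shots = [rng.normal(0.5, 0.1, (64, 64)) + 0.05 * K for _ in range(20)]
F = estimate_fingerprint(shots)                           # estimated fingerprint
query = rng.normal(0.5, 0.1, (64, 64)) + 0.05 * K         # same "camera"
other = rng.normal(0.5, 0.1, (64, 64))                    # different source
print(correlation(noise_residual(query), F),
      correlation(noise_residual(other), F))
```

    On this toy data the matching image scores a clearly higher correlation with the fingerprint than the unrelated one, which is the decision criterion real PRNU systems threshold on.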

    Continuous Variable Optimisation of Quantum Randomness and Probabilistic Linear Amplification

    In the past decade, quantum communication protocols based on continuous variables (CV) have seen considerable development in both theoretical and experimental aspects. Nonetheless, challenges remain in both the practical security and the operating range of CV systems before such systems may be used extensively. In this thesis, we present the optimisation of experimental parameters for secure randomness generation and propose a non-deterministic approach to enhance the amplification of CV quantum states. The first part of this thesis examines the security of quantum devices: in particular, we investigate quantum random number generators (QRNG) and quantum key distribution (QKD) schemes. In a realistic scenario, the output of a quantum random number generator is inevitably tainted by classical technical noise, which potentially compromises the security of such a device. To safeguard against this, we propose and experimentally demonstrate an approach that produces side-information-independent randomness. We present a method for maximising the randomness contained in a number sequence generated from a given quantum-to-classical-noise ratio. The detected photocurrent in our experiment is shown to support a real-time random-number generation rate of 14 (Mbit/s)/MHz. Next, we study the one-sided device-independent (1sDI) quantum key distribution scheme in the context of continuous variables. By exploiting recently proven entropic uncertainty relations, one may bound the information leaked to an eavesdropper. We use such a bound to derive the secret key rate, which depends only upon the conditional Shannon entropies accessible to Alice and Bob, the two honest communicating parties. We identify and experimentally demonstrate such a protocol, using only coherent states as the resource. We measure the correlations necessary for 1sDI key distribution up to an applied loss equivalent to 3.5 km of fibre transmission.
The second part of this thesis concerns improving the transmission of a quantum state. We study two approximate implementations of a probabilistic noiseless linear amplifier (NLA): a physical implementation that truncates the working space of the NLA, and a measurement-based implementation that realises the truncation by a bounded postselection filter. We conduct a full analysis of the measurement-based NLA (MB-NLA), making explicit the relationship between its operating parameters, such as the amplification gain and the cut-off of its operating domain. We compare it with its physical counterpart in terms of the Husimi Q-distribution and the probability of success. We take our investigations further by combining a probabilistic NLA with an ideal deterministic linear amplifier (DLA). In particular, we show that when the NLA gain is strictly less than the DLA gain, this combination can be realised by integrating an MB-NLA into an optical DLA setup. This results in a hybrid device which we refer to as the heralded hybrid quantum amplifier. A quantum cloning machine based on this hybrid amplifier is constructed through an amplify-then-split method. We perform probabilistic cloning of arbitrary coherent states and demonstrate the production of up to five clones, with the fidelity of each clone clearly exceeding the corresponding no-cloning limit.
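    For orientation, the ideal NLA and the no-cloning limit invoked above have standard textbook forms (general results, not specific to this thesis). The ideal NLA acts as the operator

\[
\hat{T} = g^{\hat{n}}, \qquad
\hat{T}\,\lvert \alpha \rangle \;\propto\; e^{\frac{1}{2}(g^2 - 1)\lvert \alpha \rvert^2}\,\lvert g\alpha \rangle ,
\]

    amplifying a coherent state without added noise, which is why it can only succeed probabilistically. The Gaussian no-cloning limit for \(1 \to N\) cloning of coherent states is

\[
F_N = \frac{N}{2N - 1},
\]

    so the five-clone demonstration must beat \(F_5 = 5/9 \approx 0.556\) per clone, while the familiar \(1 \to 2\) bound is \(2/3\).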

    Improving the sensitivity of future GW observatories in the 1-10 Hz band: Newtonian and seismic noise

    The next generation of gravitational-wave interferometric detectors will likely be underground detectors, to extend the GW detection frequency band to frequencies below the Newtonian noise limit. Newtonian noise originates from the continuous motion of the Earth’s crust driven by human activity, tidal stresses, and seismic motion, and from mass density fluctuations in the atmosphere. It is calculated that on the Earth’s surface, on a typical day, it will exceed the expected GW signals at frequencies below 10 Hz. The noise will decrease underground by an unknown amount. It is important to investigate and quantify this expected reduction and its effect on the sensitivity of future detectors, in order to plan further improvement strategies. We report on some of these aspects. Analytical models can be used in the simplest scenarios to gain a better qualitative and semi-quantitative understanding. Since more complete modeling must be done numerically, we also discuss some results obtained with a finite-element-based modeling tool. The method is verified by comparing its results with those of analytic calculations for surface detectors. A key point about noise models is their initial parameters and conditions, which require detailed information about seismic motion in a real scenario. We describe an effort, currently in progress, to characterize the seismic activity at the Homestake mine. This activity is specifically aimed at providing information and exploring the site as a possible candidate for an underground observatory. Although the only compelling reason to put the interferometer underground is to reduce the Newtonian noise, we expect that the more stable underground environment will have a more general positive impact on the sensitivity. We end this report with some considerations about seismic and suspension noise.
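    A rough intuition for why depth helps (a simplified model, not a result from this report): the surface Rayleigh waves that dominate seismic Newtonian noise decay roughly exponentially with depth on the scale of a wavelength, so for ground wave speed c and frequency f the residual amplitude at depth d scales approximately as exp(-2πfd/c). The wave speed of 3 km/s below is an assumed illustrative value for hard rock.

```python
# Rough Rayleigh-wave amplitude attenuation with depth (simplified:
# true Rayleigh eigenfunctions are a sum of two exponentials, but
# exp(-k*d) with k = 2*pi*f/c captures the scaling).
import math

def rayleigh_attenuation(depth_m, freq_hz, c_ms=3000.0):
    """Residual amplitude factor at depth d for frequency f, speed c."""
    return math.exp(-2 * math.pi * freq_hz * depth_m / c_ms)

for d in (0, 100, 300, 1000):
    print(f"{d:>5} m: amplitude factor at 1 Hz = {rayleigh_attenuation(d, 1.0):.3f}")
```

    At 1 Hz and c = 3 km/s the wavelength is 3 km, so even a 1 km deep site only suppresses the surface-wave amplitude by roughly an order of magnitude, which is why the residual underground Newtonian noise still needs careful site characterization.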

    Detector Improvements and Optimization to Advance Gravitational-wave Astronomy

    This thesis covers a range of topics relevant to current and future gravitational-wave facilities. After the last science observing run, O3, which ended in March 2020, the aLIGO and Virgo gravitational-wave detectors are undergoing upgrades to improve their sensitivity. My thesis focuses on the work done at the LIGO Hanford Observatory to facilitate these upgrade activities. I worked to develop two novel technologies with applications to gravitational-wave detectors. First, I developed a high-bandwidth, low-noise, flexure-based piezo-deformable mirror for active mode-matching. Mode-matching losses limit improvements from squeezing, as they distort the ground state of the squeezed beam. For broadband sensitivity improvements from frequency-dependent squeezing, it is critical to ensure low mode-mismatch losses. These piezo-deformable mirrors are being installed at the aLIGO facilities. Second, I worked to develop and test a high-resolution wavefront sensor that employs a time-of-flight sensor. By achieving phase-locking between the demodulation signal for the time-of-flight sensor and the incident modulated laser beam, this camera is capable of sensing higher-order mode distortions of the incident beam. Cosmic Explorer is a proposed next-generation gravitational-wave observatory in the United States, planned to be operational by the mid-2030s. Cosmic Explorer, along with Einstein Telescope, will form a network of next-generation gravitational-wave detectors. I propose a science-goal-focused tunable design for the Cosmic Explorer detectors that allows the sensitivity to be tuned toward low, mid, or high frequencies. These tuning options give Cosmic Explorer the flexibility to target a diverse set of science goals with the same detector infrastructure. The technological challenges to achieving these tunable configurations are presented.
I find that a 40 km Cosmic Explorer detector outperforms a 20 km detector in all key science goals other than access to post-merger physics. This suggests that Cosmic Explorer should include at least one 40 km facility. I also explore the detection prospects of core-collapse supernovae with the third-generation facilities -- Cosmic Explorer and Einstein Telescope. I find that the weak gravitational-wave signature of core-collapse supernovae limits the likely sources to within our galaxy. This corresponds to a low event rate of two per century.
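    The 40 km versus 20 km comparison reflects the basic scaling of interferometric strain readout (a general detector-physics argument, not a figure quoted from the thesis): the measured strain is the arm-length change divided by the arm length,

\[
h = \frac{\Delta L}{L},
\]

    so in frequency bands limited by displacement noise of fixed amplitude, doubling the arm length from 20 km to 40 km halves the strain noise. (The gain is not uniform at all frequencies; effects such as the arm cavity free spectral range modify the response of very long detectors at high frequency.)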

    Post-selection-based continuous variable quantum information processing

    Quantum communication and computation harness the intriguing and bewildering nature of quantum mechanics to realize information processing tasks that have no classical analog. Nonetheless, this supremacy comes with fundamental limits that, in some scenarios, pose undesirable bounds on the performance of these quantum technologies. One such example is the well-known quantum no-cloning theorem imposed by the Heisenberg uncertainty principle. It states that an unknown quantum state cannot be duplicated with arbitrarily high accuracy. Very recently, however, post-selection was proposed as a way out: it was demonstrated that in various quantum information tasks, deterministic bounds can be overcome by forgoing determinism. In this thesis, we investigate post-selection as a novel approach to enhance the performance of versatile continuous-variable (CV) quantum information processing, and envisage it becoming a useful component of the general Gaussian toolbox. The first part of this thesis examines applications of post-selection in purely linear systems. In particular, two implementations of the noiseless linear amplifier (NLA), the measurement-based NLA and the physical NLA, are investigated and compared in terms of their ability to preserve the Gaussianity of the state and their success probability. We show that the inevitable signal-to-noise ratio (SNR) degradation accompanying a linear quantum amplifier can be circumvented by resorting to a probabilistic scheme. Amplification with a signal transfer coefficient of Ts>1 is realised by combining a measurement-based NLA with a deterministic linear amplifier. We also construct a quantum cloning machine based on this hybrid amplifier for arbitrary coherent input states. We demonstrate the production of multiple clones (up to five), with the fidelity of each clone exceeding the corresponding no-cloning limit. We then consider employing the post-selection algorithm in information protocols involving nonlinearity.
First, we develop two squeezers as optical parametric amplifiers, each producing a fairly pure squeezed output field of up to 11.2 dB (after correcting for the detection loss). These squeezers serve as the nonlinear source in the remaining part of this thesis. We demonstrate a high-fidelity quantum squeezing gate, an indispensable building block for constructing a universal CV quantum computer. An inverse-Gaussian filter is incorporated into the feedforward line, improving the precision of the inline dual-homodyne measurement and thereby efficiently combating the correlation degradation due to loss and noise introduced during feedforward. As one example, we show that a fidelity of 98.49% for a target squeezing of -2.3 dB is obtained with only -6 dB of ancilla squeezing, which would otherwise require -20.5 dB of initial squeezing using a conventional deterministic setup. Additionally, we introduce a CV quantum teleportation scheme using post-selection, allowing for a significantly improved fidelity over the conventional deterministic CV teleporter. The intuition behind this improvement is that post-selection effectively distils the accessible entanglement, so a high fidelity originally achievable only with a large amount of initial squeezing is now obtainable with a modest amount of squeezing, at the expense of a finite success probability.
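    For concreteness, the squeezing figures quoted above use the standard decibel convention for quadrature variances relative to vacuum (a general convention, not specific to this work):

\[
S_{\mathrm{dB}} = 10 \log_{10} \frac{V_{\mathrm{sq}}}{V_{\mathrm{vac}}},
\]

    so -6 dB of ancilla squeezing corresponds to a quadrature variance of about \(0.25\,V_{\mathrm{vac}}\), while the -20.5 dB a deterministic setup would need corresponds to about \(0.009\,V_{\mathrm{vac}}\), illustrating how much resource the post-selected scheme saves.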

    PRISM (Polarized Radiation Imaging and Spectroscopy Mission): an extended white paper

    PRISM (Polarized Radiation Imaging and Spectroscopy Mission) was proposed to ESA in May 2013 as a large-class mission for investigating, within the framework of the ESA Cosmic Vision program, a set of important scientific questions that require high-resolution, high-sensitivity, full-sky observations of the sky emission at wavelengths ranging from millimeter-wave to the far-infrared. PRISM’s main objective is to explore the distant universe, probing cosmic history from very early times until now as well as the structures, distribution of matter, and velocity flows throughout our Hubble volume. PRISM will survey the full sky in a large number of frequency bands in both intensity and polarization, and will measure the absolute spectrum of sky emission more than three orders of magnitude better than COBE FIRAS. The data obtained will allow us to precisely measure the absolute sky brightness and polarization of all the components of the sky emission in the observed frequency range, separating the primordial and extragalactic components cleanly from the galactic and zodiacal light emissions.
The aim of this Extended White Paper is to provide a more detailed overview of the highlights of the new science that will be made possible by PRISM, which include: (1) the ultimate galaxy cluster survey using the Sunyaev-Zeldovich (SZ) effect, detecting approximately 10^6 clusters extending to large redshift, including a characterization of the gas temperature of the brightest ones (through the relativistic corrections to the classic SZ template) as well as a peculiar velocity survey using the kinetic SZ effect that comprises our entire Hubble volume; (2) a detailed characterization of the properties and evolution of dusty galaxies, where most of the star formation in the universe took place, the faintest population of which constitute the diffuse CIB (Cosmic Infrared Background); (3) a characterization of the B modes from primordial gravity waves generated during inflation and from gravitational lensing, as well as the ultimate search for primordial non-Gaussianity using CMB polarization, which is less contaminated by foregrounds on small scales than the temperature anisotropies; (4) a search for distortions from a perfect blackbody spectrum, which include some nearly certain signals and others that are more speculative but more informative; and (5) a study of the role of the magnetic field in star formation and its interaction with other components of the interstellar medium of our Galaxy. These are but a few of the highlights presented here, along with a description of the proposed instrument.
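    The thermal SZ effect underlying the cluster survey in (1) has a standard non-relativistic spectral form (a textbook result, included here for orientation rather than taken from the white paper):

\[
\frac{\Delta T}{T_{\mathrm{CMB}}} = y \left( x \coth\frac{x}{2} - 4 \right), \qquad
x = \frac{h\nu}{k_B T_{\mathrm{CMB}}}, \qquad
y = \int \frac{k_B T_e}{m_e c^2}\, n_e \sigma_T \, \mathrm{d}l ,
\]

    where the Comptonization parameter \(y\) integrates the electron pressure along the line of sight. The distortion vanishes near 217 GHz and changes sign across it, which is the spectral signature multi-band surveys like PRISM exploit to separate clusters from other emission; the relativistic corrections mentioned in the text perturb this shape in a temperature-dependent way.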

    Quantum optomechanics in the unresolved sideband regime
