
    Reconstruction of Convex Sets from One or Two X-rays

    We consider a class of problems of Discrete Tomography which has been deeply investigated in the past: the reconstruction of convex lattice sets from their horizontal and/or vertical X-rays, i.e., from the number of points on each of a sequence of consecutive horizontal and vertical lines. The reconstruction of HV-convex polyominoes usually proceeds in two steps: first a filling step consisting of filling operations, and second the convex aggregation of the switching components. We prove three results about the convex aggregation step: (1) The convex aggregation step used for the reconstruction of HV-convex polyominoes does not always provide a solution. The example establishing this result is called "the bad guy" and disproves a conjecture of the domain. (2) The reconstruction of a digital convex lattice set from only one X-ray can be performed in polynomial time. We prove this by encoding the convex aggregation problem in a Directed Acyclic Graph. (3) With the same strategy, we prove that the reconstruction of fat digital convex sets from their horizontal and vertical X-rays can be solved in polynomial time. Fatness is a property of digital convex sets concerning the relative positions of the left, right, top and bottom points of the set. The complexity of reconstructing the lattice sets which are not fat remains an open question. (Comment: 31 pages, 24 figures.)
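
    As a quick illustration of the objects involved (this is not the paper's reconstruction algorithm, and all names are invented for this sketch), the following Python snippet computes the horizontal and vertical X-rays of a finite lattice set, i.e. the number of points on each horizontal and vertical line:

```python
# Minimal sketch: horizontal and vertical X-rays of a finite lattice set.
# This only computes the data the reconstruction problem starts from; it does
# not perform filling operations or convex aggregation.
from collections import Counter

def xrays(lattice_set):
    """Return (horizontal, vertical) X-rays of a set of integer points (x, y)."""
    horizontal = Counter(y for _, y in lattice_set)  # points on each row
    vertical = Counter(x for x, _ in lattice_set)    # points on each column
    return horizontal, vertical

# Example: a small HV-convex polyomino.
points = {(0, 0), (1, 0), (1, 1), (2, 1)}
h, v = xrays(points)
print(dict(h))  # {0: 2, 1: 2}
print(dict(v))  # {0: 1, 1: 2, 2: 1}
```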

    Dynamic reconstruction and data reconstruction for subsampled or irregularly sampled data

    The Nyquist–Shannon criterion indicates the sample rate necessary to identify information with particular frequency content from a dynamical system. However, in experimental applications such as the interrogation of a flow field using particle image velocimetry (PIV), it may be impracticable or expensive to obtain data at the desired temporal resolution. To address this problem, we propose a new approach to identify temporal information from undersampled data, using ideas from modal decomposition algorithms such as dynamic mode decomposition (DMD) and optimal mode decomposition (OMD). The novel method takes a vector-valued signal, such as an ensemble of PIV snapshots, sampled at random time instances (but at sub-Nyquist rate), and projects it onto a low-order subspace. Subsequently, dynamical characteristics, such as frequencies and growth rates, are estimated by iteratively approximating the flow evolution with a low-order model and solving a convex optimisation problem. The methodology is demonstrated on three dynamical systems: a synthetic sinusoid, the cylinder wake at Reynolds number, and turbulent flow past an axisymmetric bullet-shaped body. In all cases the algorithm correctly identifies the characteristic frequencies and oscillatory structures present in the flow.
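
    For context, a minimal sketch of standard, uniformly sampled exact DMD is given below; the paper's contribution is precisely that it relaxes this uniform-sampling requirement via a low-order model and a convex optimisation, which this sketch does not reproduce. All function and variable names are illustrative.

```python
# Minimal exact-DMD sketch (assumes uniformly sampled snapshot pairs).
import numpy as np

def exact_dmd(X, Y, r):
    """X, Y: snapshot matrices with Y[:, k] ~ A @ X[:, k]; r: truncation rank.
    Returns the DMD eigenvalues (Ritz values) and modes."""
    U, s, Vh = np.linalg.svd(X, full_matrices=False)
    U, s, Vh = U[:, :r], s[:r], Vh[:r, :]               # rank-r truncation
    A_tilde = U.conj().T @ Y @ Vh.conj().T / s          # projected operator
    eigvals, W = np.linalg.eig(A_tilde)
    modes = (Y @ Vh.conj().T / s) @ W                   # exact DMD modes
    return eigvals, modes

# With sample spacing dt, continuous-time frequencies and growth rates follow
# from np.log(eigvals) / dt.
```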

    Numerical investigation of the structure effects on water transportation in PEMFC gas diffusion layers using X-ray tomography based Lattice Boltzmann method

    The excessive presence of liquid water in a gas diffusion layer (GDL) hinders the access of reactant gases to the active sites of the catalyst layer, leading to decreased performance of a polymer electrolyte membrane fuel cell (PEMFC). Therefore, GDLs are usually treated with a hydrophobic agent to render their fibres more hydrophobic in order to facilitate gas transport and water removal. Numerous studies have investigated water transport in PEMFCs in recent years; however, the behaviour of liquid water in a GDL at the pore level is poorly understood. Macroscopic models fail to incorporate the influence of the structural morphology of GDLs on liquid water transport behaviour, and experimental methods are not conducive to a good understanding at the microscopic level because of the diminutive size of the GDL's porous structure. Alternatively, the Lattice Boltzmann (LB) method has gathered interest, as it is particularly useful for fluid flow simulations in porous media due to its capability to incorporate the complex boundaries of actual GDL structures. To date, most studies on fluid transport in GDLs have integrated artificial structures generated by stochastic simulation techniques into the LB models. Such stochastically generated models, however, do not closely represent the microscopic features of the actual GDL as manufactured. In addition, comparisons of liquid water transport behaviour in different GDL structures using the LB method are rare, since only a single GDL material has been utilised in most of those studies.

    This thesis aims to develop our understanding of liquid water transport behaviour in GDLs with morphologically different structures under varying wettability conditions, based on the LB method and the X-ray computed tomography (XCT) technique. GDLs with paper and felt structures were reconstructed into 3D digital volumetric models via the XCT process. The digital models were then incorporated into an LB solver to model water saturation distribution through the GDL domains. The GDL wettability was also altered so that its effect on liquid water behaviour in the GDL could be examined. The project is divided into three main sections.

    In the sensitivity analysis, the effect of image resolution on gas permeability through the X-ray-reconstructed GDL was investigated using a single-phase LB model. It was found that the resolution variation could significantly affect the resulting gas permeability in both principal and off-principal directions, as well as the computational time. An optimum resolution, however, exists at 2.72 ”m/pixel, which required 400 times less computational time with less than 8% difference in the resulting permeability compared to the base resolution. This study also served as a guideline for selecting a resolution for generating the XCT images of the GDLs utilised in the subsequent studies.

    In the structure analysis, the structures of the paper and felt GDLs were generated using XCT, and the key properties of each GDL, including thickness, porosity, permeability and tortuosity, were characterised. The thickness and through-plane porosity distributions of each GDL were examined based on the tomography images. The resulting local through-plane porosity distributions were then used to calculate through-plane permeability and tortuosity distributions using an analytical model available in the literature. This study revealed the heterogeneity of the GDLs and how the heterogeneous nature of the GDL structures affects other properties of the GDLs. In this study, the absolute through-plane permeability and tortuosity of the X-ray-reconstructed GDL samples were also characterised using the single-phase LB model. The results from the two models were then compared and validated against data in the literature.

    In the water transport analysis, a two-phase LB model was employed to examine the effects of GDL structures on the behaviour of liquid water in the GDLs, including invasion patterns, saturation distribution and breakthrough behaviour under varying GDL wettability conditions. It was found that wettability was responsible for the invasion patterns and water saturation levels, whilst the GDL structure was mostly responsible for breakthrough occurrence and saturation distribution. Water travelled with stable displacement, saturating all pores, in hydrophilic GDLs, while it travelled with capillary fingering, causing decreased saturation, in hydrophobic GDLs (about 50% in the highly hydrophobic cases). The GDL structure was found to play a key role in breakthrough behaviour in the hydrophilic GDL, as the through-plane fibres in the felt structure and the through-plane binders in the paper structure encouraged water removal from the GDL in the thickness direction. Conversely, the GDL structure was found to have negligible influence on breakthrough in the hydrophobic GDL. Each GDL structure, however, contributed to a distinct difference in water distribution in the GDL with hydrophobic wettability. The work presented in this thesis contributes to the understanding of liquid water transport behaviour in GDLs under the combined effects of GDL structure and wettability, which is essential for the development of effective PEMFC water management and the design of future GDL materials.
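
    As a small illustration of one step described above (an assumed workflow, not the thesis code), the local through-plane porosity profile can be read directly off a segmented XCT volume:

```python
# Minimal sketch: local through-plane porosity of a GDL from a segmented
# X-ray CT volume, where a voxel value of 1 marks solid (fibre/binder) and
# 0 marks pore space.
import numpy as np

def through_plane_porosity(volume):
    """volume: 3D 0/1 array indexed as (z, y, x), with z the thickness direction.
    Returns the pore fraction of each slice along the thickness."""
    solid_fraction = volume.reshape(volume.shape[0], -1).mean(axis=1)
    return 1.0 - solid_fraction

# Dummy example (a real GDL volume would come from the XCT reconstruction):
vol = (np.random.rand(50, 200, 200) < 0.22).astype(np.uint8)
profile = through_plane_porosity(vol)
print(profile.mean())  # bulk porosity estimate, here ~0.78
```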

    Perspectives on the Formalism of Quantum Theory

    Quantum theory has the distinction among physical theories of currently underpinning most of modern physics, while remaining essentially mysterious, with no general agreement about the nature of its principles or the underlying reality. Recently, the rise of quantum information science has shown that thinking in operational or information-theoretic terms can be extremely enlightening, and that a fruitful direction for understanding quantum theory is to study it in the context of more general probabilistic theories. The framework for such theories will be reviewed in Chapter Two. In Chapter Three we will study a property of quantum theory called self-duality, which is a correspondence between states and observables. In particular, we will show that self-duality follows from a computational primitive called bit symmetry, which states that every logical bit can be mapped to any other logical bit by a reversible transformation. In Chapter Four we will study a notion of probabilistic interference based on a hierarchy of interference-type experiments involving multiple slits. We characterize theories which do not exhibit interference in experiments with k slits, and give a simple operational interpretation. We also prove a connection between bit-symmetric theories which possess certain natural transformations and those which exhibit at most two-slit interference. In Chapter Five we will focus on reconstructing the algebraic structures of quantum theory. We will show that the closest cousins of standard quantum theory, namely the finite-dimensional Jordan-algebraic theories, can be characterized by three simple principles: (1) a generalized spectral decomposition, (2) a high degree of symmetry, and (3) a generalization of the von Neumann–LĂŒders projection postulate. Finally, we also show that the absence of three-slit interference may be used as an alternative to the third principle. In Chapter Six, we focus on quantum statistical mechanics and the problem of understanding how its characteristic features can be derived from an exact treatment of the underlying quantum system. Our central assumptions are sufficiently complex dynamics, encoded as a condition on the complexity of the eigenvectors of the Hamiltonian, and an information-theoretic restriction on measurement resources. We show that for almost all Hamiltonian systems, measurement outcome probabilities are indistinguishable from the uniform distribution.
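
    For orientation, the multi-slit interference hierarchy referred to above is commonly quantified (following Sorkin) by terms of the form below, where P_A, P_{AB}, ... denote outcome probabilities with the indicated slits open; this is the standard definition, not necessarily the exact notation of the thesis:

    \[
    I_{AB} = P_{AB} - P_A - P_B, \qquad
    I_{ABC} = P_{ABC} - P_{AB} - P_{AC} - P_{BC} + P_A + P_B + P_C .
    \]

    Quantum theory generically has I_{AB} \neq 0 but I_{ABC} = 0, which is the sense in which it exhibits at most two-slit interference.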

    A quasi-real-time inertialess microwave holographic imaging system

    This thesis records the theoretical analysis and hardware development of a laboratory microwave imaging system which uses holographic principles. The application of an aperture-synthesis technique and the electronic commutation of all antennae has resulted in a compact and economical assembly which requires no moving parts and which, consequently, has a high field-mapping-speed potential. The relationship of this microwave holographic system to other established techniques is examined theoretically, and the performance of the imaging system is demonstrated using conventional optically and numerically based reconstruction of the measured holograms. The high mapping-speed potential of this system has allowed the exploitation of an imaging mode not usually associated with microwave holography. In particular, a certain antenna-array specification leads to a versatile imaging system which corresponds closely, at the laboratory scale, to the widely used synthetic aperture radar principle. It is envisaged that the microwave holographic implementation of this latter principle be used as laboratory instrumentation in the elucidation of the interaction of hydrodynamic and electromagnetic waves. Some simple demonstrations of this application have been presented, and the concluding chapter also describes a suitable hardware specification. This thesis has also emphasised the hardware details of the imaging system, since the development of the microwave and other electronic components represented a substantial part of this research and because the potential applications of the imaging principle have been found to be intimately linked to the tolerances of the various microwave components. Bibliography: pages 122-132.
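
    As a generic illustration of what numerically based reconstruction of a measured hologram can involve (this shows one common approach, angular-spectrum back-propagation, and is not necessarily the processing chain used in the thesis; all names are illustrative):

```python
# Minimal sketch: angular-spectrum propagation of a sampled complex field.
import numpy as np

def propagate(field, wavelength, pixel, z):
    """field: 2D complex array sampled with spacing `pixel` on the hologram plane;
    returns the field propagated a distance z (negative z back-propagates)."""
    ny, nx = field.shape
    k = 2 * np.pi / wavelength
    fx = np.fft.fftfreq(nx, d=pixel)
    fy = np.fft.fftfreq(ny, d=pixel)
    FX, FY = np.meshgrid(fx, fy)
    kz2 = k**2 - (2 * np.pi * FX)**2 - (2 * np.pi * FY)**2
    kz = np.sqrt(np.maximum(kz2, 0.0))           # drop evanescent components
    H = np.exp(1j * kz * z)                       # angular-spectrum transfer function
    return np.fft.ifft2(np.fft.fft2(field) * H)
```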

    Towards Scalable Characterization of Noisy, Intermediate-Scale Quantum Information Processors

    In recent years, quantum information processors (QIPs) have grown from one or two qubits to tens of qubits. As a result, characterizing QIPs – measuring how well they work, and how they fail – has become much more challenging. The obstacles to characterizing today’s QIPs will grow even more difficult as QIPs grow from tens of qubits to hundreds and enter what has been called the “noisy, intermediate-scale quantum” (NISQ) era. This thesis develops methods based on advanced statistics and machine learning algorithms to address the difficulties of “quantum characterization, validation, and verification” (QCVV) of NISQ processors. In the first part of this thesis, I use statistical model selection to develop techniques for choosing between several models for a QIP’s behavior. In the second part, I deploy machine learning algorithms to develop a new QCVV technique and to design experiments. These investigations help lay a foundation for extending QCVV to characterize the next generation of NISQ processors.
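
    A toy example of the statistical-model-selection idea (illustrative only, not the thesis’s actual QCVV pipeline) is choosing between an unbiased and a biased outcome model for repeated qubit measurements via the Akaike information criterion:

```python
# Minimal sketch: AIC-based choice between two models of measurement counts.
import numpy as np

def log_likelihood(p, n_ones, n_total):
    """Binomial log-likelihood of n_ones '1' outcomes in n_total shots."""
    return n_ones * np.log(p) + (n_total - n_ones) * np.log(1 - p)

counts = np.array([53, 47, 61, 49])      # hypothetical '1' counts, 100 shots each
n_ones, n_total = counts.sum(), 100 * len(counts)

ll_fair = log_likelihood(0.5, n_ones, n_total)          # model 1: no free parameters
p_hat = n_ones / n_total
ll_biased = log_likelihood(p_hat, n_ones, n_total)      # model 2: one fitted parameter

aic_fair, aic_biased = -2 * ll_fair, 2 * 1 - 2 * ll_biased
print("prefer biased model" if aic_biased < aic_fair else "prefer fair model")
```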

    Author index to volumes 301–400


    A total hip replacement toolbox : from CT-scan to patient-specific FE analysis


    Quantum-classical generative models for machine learning

    The combination of quantum and classical computational resources towards more effective algorithms is one of the most promising research directions in computer science. In such a hybrid framework, existing quantum computers can be used to their fullest extent and for practical applications. Generative modeling is one of the applications that could benefit the most, either by speeding up the underlying sampling methods or by unlocking more general models. In this work, we design a number of hybrid generative models and validate them on real hardware and datasets. The quantum-assisted Boltzmann machine is trained to generate realistic artificial images on quantum annealers. Several challenges in state-of-the-art annealers must be overcome before one can assess their actual performance. We address some of the most pressing challenges, such as the sparse qubit-to-qubit connectivity, the unknown effective temperature, and the noise on the control parameters. In order to handle datasets of realistic size and complexity, we include latent variables and obtain a more general model called the quantum-assisted Helmholtz machine. In the context of gate-based computers, the quantum circuit Born machine is trained to encode a target probability distribution in the wavefunction of a set of qubits. We implement this model on a trapped-ion computer using low-depth circuits and native gates. We use the generative modeling performance on the canonical Bars-and-Stripes dataset to design a benchmark for hybrid systems. It is reasonable to expect that quantum data, i.e., datasets of wavefunctions, will become available in the future. We derive a quantum generative adversarial network that works with quantum data. Here, two circuits are optimized in tandem: one tries to generate suitable quantum states, while the other tries to distinguish between target and generated states.
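
    The Bars-and-Stripes benchmark mentioned above is easy to enumerate; the sketch below uses the standard construction, though the exact encoding may differ from the one used in this work:

```python
# Minimal sketch: generate the n x n Bars-and-Stripes (BAS) dataset, i.e. all
# binary images whose rows (bars) or columns (stripes) are constant.
import itertools
import numpy as np

def bars_and_stripes(n):
    patterns = set()
    for bits in itertools.product([0, 1], repeat=n):
        row = np.array(bits, dtype=np.uint8)
        patterns.add(tuple(np.tile(row, (n, 1)).ravel()))             # constant columns
        patterns.add(tuple(np.repeat(row, n).reshape(n, n).ravel()))  # constant rows
    return np.array(sorted(patterns), dtype=np.uint8).reshape(-1, n, n)

data = bars_and_stripes(2)
print(len(data))  # 6 distinct 2x2 patterns (2**(n+1) - 2 in general)
```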

    Quantum metrology with nonclassical states of atomic ensembles

    Quantum technologies exploit entanglement to revolutionize computing, measurements, and communications. This has stimulated research in different areas of physics aimed at engineering and manipulating fragile many-particle entangled states. Progress has been particularly rapid for atoms. Thanks to the large and tunable nonlinearities and the well-developed techniques for trapping, controlling and counting, many groundbreaking experiments have demonstrated the generation of entangled states of trapped ions and of cold and ultracold gases of neutral atoms. Moreover, atoms can couple strongly to external forces and light fields, which makes them ideal for ultra-precise sensing and timekeeping. All these factors call for generating non-classical atomic states designed for phase estimation in atomic clocks and atom interferometers, exploiting many-body entanglement to increase the sensitivity of precision measurements. The goal of this article is to review and illustrate the theory and the experiments with atomic ensembles that have demonstrated many-particle entanglement and quantum-enhanced metrology. (Comment: 76 pages, 40 figures, 1 table, 603 references. Some figures bitmapped at 300 dpi to reduce file size.)
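
    The quantitative gain referred to here is the standard phase-estimation scaling: with N uncorrelated atoms the interferometric phase uncertainty is bounded by the standard quantum limit, while entangled states can approach the Heisenberg limit (textbook results, stated only for context):

    \[
    \Delta\phi_{\mathrm{SQL}} = \frac{1}{\sqrt{N}}, \qquad
    \Delta\phi_{\mathrm{HL}} = \frac{1}{N}.
    \]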