Computing Functions of Random Variables via Reproducing Kernel Hilbert Space Representations
We describe a method to perform functional operations on probability
distributions of random variables. The method uses reproducing kernel Hilbert
space representations of probability distributions, and it is applicable to all
operations which can be applied to points drawn from the respective
distributions. We refer to our approach as "kernel probabilistic
programming". We illustrate it on synthetic data, and show how it can be used
for nonparametric structural equation models, with an application to causal
inference.
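A minimal sketch of the sampling view behind this approach (the variable names, the Gaussian RBF kernel, and the choice of operation f(X, Y) = X + Y are illustrative assumptions, not the paper's experiments): each distribution is represented by samples, the functional operation is applied pointwise to those samples, and the resulting distribution is summarized by its empirical kernel mean embedding.

```python
import numpy as np

def rbf(a, b, gamma=1.0):
    # Gaussian RBF kernel matrix between two 1-D sample sets a and b
    d2 = (a[:, None] - b[None, :]) ** 2
    return np.exp(-gamma * d2)

rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, 200)   # samples representing p(X)
y = rng.normal(2.0, 0.5, 200)   # samples representing p(Y)

# Apply the operation pointwise to sample pairs: Z = f(X, Y) = X + Y.
z = x + y

# Empirical kernel mean embedding of Z: mu_Z(t) = mean_i k(z_i, t),
# evaluated here on a small grid of test points.
t = np.linspace(-2, 6, 5)
mu_z = rbf(z, t).mean(axis=0)
```

The embedding `mu_z` is largest near t = 2, the mean of Z, as expected for samples concentrated there.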
Vision and revision: wavefront sensing from the image domain
An ideal telescope with no optical aberrations can achieve a resolution and contrast limited by the wave nature of light, such that the finest detail that can be resolved is of the order of the angle subtended by one wavelength over the diameter of the telescope. For telescopes operating close to this ideal case, however, it is rare that the full performance of the diffraction limit is achieved, as small optical imperfections cause speckles to appear in the image. These are difficult to calibrate, as they are often caused by thermal and mechanical variations in the optical path which vary slowly with time. The quasi-static speckles that result can mimic the real signal of a faint star or planet orbiting the primary target, and they therefore set the principal limitation on the angular resolution and contrast of instruments designed to detect exoplanets and faint companions. These aberrations can be corrected by active optics, where a wavefront sensor is used to reconstruct a map of the distortions which can then be compensated for by a deformable mirror, but this approach has a problem of its own: differential aberrations between the wavefront sensor and the science camera are not detected. In this thesis, I discuss a successful laboratory implementation of a recently-proposed technique for reconstructing a wavefront map using only the image taken with the science camera, which can be used to calibrate this non-common path error. This approach, known as the asymmetric pupil Fourier wavefront sensor, requires that the pupil not be centrosymmetric, which is easily achieved with a mask, with segment tilting, or with judiciously placed spiders supporting the secondary mirror, and it represents a promising way forward for characterizing and correcting segment misalignments on future missions such as the James Webb Space Telescope.
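The role of the pupil asymmetry can be illustrated with a toy Fourier-optics model (a sketch with arbitrary grid sizes and a hypothetical one-sided mask, not the laboratory setup): with a centrosymmetric pupil, an even aberration and its negative produce identical focal-plane images, so the sign of the aberration cannot be recovered from the science camera alone; clipping one side of the pupil breaks this degeneracy.

```python
import numpy as np

def psf(pupil, phase):
    # Fraunhofer model: image intensity is |FFT of the complex pupil field|^2
    field = np.fft.fft2(pupil * np.exp(1j * phase))
    return np.abs(field) ** 2

n = 128
y, x = np.mgrid[-n // 2:n // 2, -n // 2:n // 2]
r = np.hypot(x, y)
disk = (r < 30).astype(float)            # centrosymmetric circular pupil
phi = 0.5 * (r / 30) ** 2 * disk         # even (defocus-like) aberration, 0.5 rad peak

# Symmetric pupil: +phi and -phi give the same image (sign degeneracy).
sym_diff = np.abs(psf(disk, phi) - psf(disk, -phi)).max()

# Hypothetical asymmetric mask (one edge clipped) breaks the degeneracy.
mask = disk * (x < 20)
asym_diff = np.abs(psf(mask, phi) - psf(mask, -phi)).max()

print(sym_diff / psf(disk, phi).max())   # ~0: images are degenerate
print(asym_diff / psf(mask, phi).max())  # clearly nonzero: sign is detectable
```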
Proceedings of the 2018 Joint Workshop of Fraunhofer IOSB and Institute for Anthropomatics, Vision and Fusion Laboratory
The Proceedings of the 2018 annual joint workshop of the Fraunhofer IOSB and the Vision and Fusion
Laboratory (IES) of the KIT contain technical reports by the PhD students on the status of their
research. The topics discussed range from computer vision and optical
metrology to network security and machine learning.
This volume provides a comprehensive and up-to-date overview of the research program of the IES
Laboratory and the Fraunhofer IOSB.
GREGOR Fabry-Perot Interferometer - status report and prospects
The GREGOR Fabry-Perot Interferometer (GFPI) is one of three first-light
instruments of the German 1.5-meter GREGOR solar telescope at the Observatorio
del Teide, Tenerife, Spain. The GFPI allows fast narrow-band imaging and
post-factum image restoration. The retrieved physical parameters will be a
fundamental building block for understanding the dynamic Sun and its magnetic
field at spatial scales down to 50 km on the solar surface. The GFPI is a
tunable dual-etalon system in a collimated mounting. It is designed for
spectropolarimetric observations over the wavelength range from 530-860 nm with
a theoretical spectral resolution of R ~ 250,000. The GFPI is equipped with a
full-Stokes polarimeter. Large-format, high-cadence CCD detectors with powerful
computer hard- and software enable the scanning of spectral lines in time spans
equivalent to the evolution time of solar features. The field-of-view of 50" x
38" covers a significant fraction of the typical area of active regions. We
present the main characteristics of the GFPI including advanced and automated
calibration and observing procedures. We discuss improvements in the optical
design of the instrument and show first observational results. Finally, we lay
out first concrete ideas for the integration of a second FPI, the Blue Imaging
Solar Spectrometer, which will explore the blue spectral region below 530 nm.Comment: 18 pages, 9 Figures, 4 Tables, "Astronomical Telescopes and
Instrumentation", Amsterdam, 1-6 July 2012, SPIE Proc. 8446-276, in pres
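For context on the quoted resolving power (the wavelengths below are chosen for illustration): R = λ/δλ ≈ 250,000 corresponds to a spectral element of a few picometres across the instrument's 530-860 nm range.

```python
# Spectral element implied by a resolving power R = lambda / delta_lambda.
R = 250_000
for lam_nm in (530, 630, 860):               # band edges plus a mid-range wavelength
    delta_pm = lam_nm / R * 1e3              # nm -> pm
    print(f"{lam_nm} nm -> {delta_pm:.2f} pm")
```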
Project Tech Top study of lunar, planetary and solar topography Final report
Data acquisition techniques for information on lunar, planetary, and solar topography.
Acoustic Particle-Image Velocimetry: development and applications
Particle Image Velocimetry (PIV) is a non-intrusive technique for simultaneously measuring the velocities at many points in a fluid flow. The fluid is seeded with tracer particles and the region under investigation is illuminated. An image of the illuminated region is captured and then, a short time period later, a second image is taken. Suitable analysis of these images yields an instantaneous velocity vector map.
Until recently, restrictions in the rate at which images could be captured have limited the PIV technique to the analysis of slow flows. However, advances in camera technology have now opened up the possibility of using PIV in the analysis of faster flows. Indeed, image capture rates are now fast enough to enable two images to be captured during a fraction of an acoustic cycle, indicating the potential for using PIV to analyse sound fields.
In this thesis, after some aspects of sound field theory have been outlined and following a discussion of the theory of PIV, the development of experimental PIV apparatus for measuring sound fields is described. Measurements of the temporal variation in the velocities of particles within some common sound fields are presented. In particular, the passage of an acoustic pulse is monitored, and the sinusoidal motion of particles in a resonating tube is recorded, yielding the corresponding standing wave pattern. Finally, the main limitations of the PIV technique when applied to acoustic fields are discussed.
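The two-frame analysis described above is commonly implemented as a cross-correlation of small interrogation windows: the offset of the correlation peak gives the mean particle displacement, which divided by the inter-frame time gives velocity. A minimal sketch (synthetic images and an assumed inter-frame time, not the thesis apparatus):

```python
import numpy as np

def displacement(win_a, win_b):
    # Circular cross-correlation of two interrogation windows via FFT;
    # the location of the correlation peak is the mean displacement in pixels.
    corr = np.fft.ifft2(np.fft.fft2(win_a).conj() * np.fft.fft2(win_b)).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Map wrapped indices to signed shifts.
    return [int(p) if p <= s // 2 else int(p) - s for p, s in zip(peak, corr.shape)]

rng = np.random.default_rng(1)
frame_a = rng.random((32, 32))                           # synthetic seeded image
frame_b = np.roll(frame_a, shift=(3, -2), axis=(0, 1))   # particles moved 3 px down, 2 px left

dy, dx = displacement(frame_a, frame_b)
print(dy, dx)                  # 3 -2
dt = 1e-4                      # hypothetical inter-frame time [s]
velocity = (dy / dt, dx / dt)  # pixels per second (scale by magnification for m/s)
```

In practice the frames are tiled into many such windows, one velocity vector per window, to build the full vector map.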
Digital Image Processing
Newspapers and the popular scientific press today publish many examples of highly impressive images. These images range, for example, from those showing regions of star birth in the distant Universe to the extent of the stratospheric ozone depletion over Antarctica in springtime, and to those regions of the human brain affected by Alzheimer’s disease. Processed digitally to generate spectacular images, often in false colour, they all make an immediate and deep impact on the viewer’s imagination and understanding.
Professor Jonathan Blackledge’s erudite but very useful new treatise Digital Image Processing: Mathematical and Computational Methods explains both the underlying theory and the techniques used to produce such images in considerable detail. It also provides many valuable example problems - and their solutions - so that the reader can test his/her grasp of the physical, mathematical and numerical aspects of the particular topics and methods discussed. As such, this magnum opus complements the author’s earlier work Digital Signal Processing. Both books are a wonderful resource for students who wish to make their careers in this fascinating and rapidly developing field which has an ever increasing number of areas of application.
The strengths of this large book lie in:
• an excellent explanatory introduction to the subject;
• thorough treatment of the theoretical foundations, dealing with both electromagnetic and acoustic wave scattering and allied techniques;
• comprehensive discussion of all the basic principles, the mathematical transforms (e.g. the Fourier and Radon transforms), their interrelationships and, in particular, Born scattering theory and its application to imaging systems modelling;
• detailed discussion - including the assumptions and limitations - of optical imaging, seismic imaging, medical imaging (using ultrasound), X-ray computer aided tomography, tomography when the wavelength of the probing radiation is of the same order as the dimensions of the scatterer, Synthetic Aperture Radar (airborne or spaceborne), digital watermarking and holography;
• detail devoted to the methods of implementation of the analytical schemes in various case studies and also as numerical packages (especially in C/C++);
• coverage of deconvolution, de-blurring (or sharpening) an image, maximum entropy techniques, Bayesian estimators, techniques for enhancing the dynamic range of an image, methods of filtering images and techniques for noise reduction;
• discussion of thresholding, techniques for detecting edges in an image and for contrast stretching, stochastic scattering (random walk models) and models for characterizing an image statistically;
• investigation of fractal images, fractal dimension segmentation, image texture, the coding and storing of large quantities of data, and image compression such as JPEG;
• a valuable summary of the important results obtained in each chapter, given at its end;
• suggestions for further reading at the end of each chapter.
I warmly commend this text to all readers, and trust that they will find it to be invaluable.
Professor Michael J Rycroft, Visiting Professor at the International Space University, Strasbourg, France, and at Cranfield University, England.
Computer-Generated Holography for Areal Additive Manufacture
With a market of approximately $10B, additive manufacture (AM) is an exciting next-generation technology with the promise of significant environmental and societal impact. AM promises to help reduce emissions and waste during manufacture while improving sustainability. Widely used in applications from hip implants to jet engines, AM remains the domain of experts due to the material and thermal challenges encountered.
AM in metals is dominated by Laser Powder Bed Fusion (L-PBF). Powder is spread in layers tens of microns thick and selectively melted by scanning a small laser spot heat source over the bed.
Traditional AM systems have limited ability to manage or compensate for the heat generated. The rapidly moving heat-source spot results in high thermal cycling and is a major influence on residual stress and distortion. Mechanical limitations in the galvo scanner mean that over- or under-heating is common and can lead to voids, boiling and spatter. The scale difference between the part size and the spot size means that predictive modelling is beyond the scope of even today's best computing clusters. These factors often make it impossible to ensure part quality without physical prototyping and destructive testing.
This thesis sets out initial research into creating a radically new AM process that uses computer-generated holography (CGH) to produce complex light patterns in a single pulse. Projecting power to the whole layer at once will mean that the thermal properties of the powders before and after writing can be factored into the processed hologram and part design. It will also significantly reduce thermal gradients and melt-pool instability.
The fields of additive manufacture and computer-generated holography are introduced in Chapter 1. Chapters 2 and 3 then provide more detail on CGH and AM modelling respectively. The first deliverable, a reusable software package capable of generating holograms, is presented in Chapter 4, and the algorithms developed for the project are introduced in Section 4.3. The first project demonstrator, an AM machine capable of printing in resins using holographic projection, is discussed in Section 6.2. This shows performance comparable to modern 3D printing machines and highlights the applicability of computer-generated holography to areal processes. Section 6.3 then discusses the ongoing development of a metal powder demonstrator. As this PhD forms the first stage of a larger project, only preliminary work on the powder demonstrator is discussed. Chapter 7 then draws conclusions and outlines the way forward for future research.
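As an illustration of the kind of hologram computation involved, here is a minimal Gerchberg-Saxton sketch (one standard phase-retrieval algorithm for computer-generated holography; the target pattern and parameters are made up, and this is not claimed to be the algorithm used in the thesis): the far-field intensity of a phase-only pupil is iteratively driven towards a target pattern.

```python
import numpy as np

def gerchberg_saxton(target, iters=50, seed=0):
    # Iteratively retrieve a phase-only hologram whose far-field intensity
    # approximates `target` (classic Gerchberg-Saxton iteration).
    rng = np.random.default_rng(seed)
    amp = np.sqrt(target)
    phase = rng.uniform(0, 2 * np.pi, target.shape)
    for _ in range(iters):
        far = np.fft.fft2(np.exp(1j * phase))
        far = amp * np.exp(1j * np.angle(far))   # impose target amplitude, keep phase
        near = np.fft.ifft2(far)
        phase = np.angle(near)                   # keep phase only (phase-only hologram)
    return phase

# Hypothetical target: a bright off-axis square on a dark background.
target = np.zeros((64, 64))
target[10:20, 30:40] = 1.0

holo = gerchberg_saxton(target)
replay = np.abs(np.fft.fft2(np.exp(1j * holo))) ** 2   # simulated replay field
```

After a few dozen iterations most of the replay energy falls inside the target region, which is the property that makes whole-layer power projection conceivable.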
The appendices provide an in-depth discussion of algorithm performance (Appendices A and B), digressions into the implementation (Appendices C and D), a laser induced damage threshold (LIDT) measurement system developed during the project (Appendices E and F), more detail on the software developed (Appendices G and H), and links to additional project resources (Appendix I).
EP/T008369/1;
EP/L016567/1;
EP/V055003/