In-Situ Dual-Port Polarization Contrast Imaging of Faraday Rotation in a High Optical Depth Ultracold 87Rb Atomic Ensemble
We study the effects of high optical depth and density on the performance of
a light-atom quantum interface. An in-situ imaging method, a dual-port
polarization contrast technique, is presented. The technique compensates for
image distortions due to refraction. We propose our imaging
method as a tool to characterize atomic ensembles for high capacity spatial
multimode quantum memories. Ultracold dense inhomogeneous Rubidium samples are
imaged and we find a resonant optical depth as high as 680 on the D1 line. The
measurements are compared with light-atom interaction models based on
Maxwell-Bloch equations. We find that an independent atom assumption is
insufficient to explain our data and present corrections due to resonant
dipole-dipole interactions.
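For context, a resonant optical depth of 680 means the sample is essentially opaque on resonance, with appreciable transmission returning only many linewidths away. A minimal sketch of the naive independent-atom Beer-Lambert picture (the very model the abstract reports as insufficient at these densities), assuming a two-level Lorentzian line shape; the function name and units are illustrative, not from the paper:

```python
import math

def transmission(od0, detuning_gamma):
    """Beer-Lambert transmission through a two-level atomic medium.

    od0: resonant optical depth.
    detuning_gamma: probe detuning in units of the natural linewidth Gamma.
    A Lorentzian absorption profile is assumed, so the effective optical
    depth falls off as OD(delta) = OD0 / (1 + 4 * delta**2).
    """
    od = od0 / (1 + 4 * detuning_gamma ** 2)
    return math.exp(-od)

# On resonance the sample is opaque; 50 linewidths out it is nearly clear.
on_res = transmission(680, 0)
off_res = transmission(680, 50)
```

Dispersive (Faraday-rotation) probing at large detuning exploits exactly this window where absorption is negligible but the phase response is still measurable.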
The effect of light-assisted collisions on matter wave coherence in superradiant Bose-Einstein condensates
We investigate experimentally the effects of light-assisted collisions on the
coherence between momentum states in Bose-Einstein condensates. The onset of
superradiant Rayleigh scattering serves as a sensitive monitor for matter wave
coherence. A subtle interplay of binary and collective effects leads to a
profound asymmetry between the two sides of the atomic resonance and provides
far greater coherence loss rates for a condensate bathed in blue-detuned light
than previously estimated. We present a simplified quantitative model
containing the essential physics to explain our experimental data and point to
a new experimental route to study strongly coupled light-matter systems.
Ancient Roman coin retrieval: a systematic examination of the effects of coin grade
Ancient coins are historical artefacts of great significance which attract the interest of scholars and a large and growing number of amateur collectors. Computer vision based analysis and retrieval of ancient coins holds much promise in this realm and has been the subject of an increasing amount of research. The present work is in great part motivated by the lack of systematic evaluation of the existing methods in the context of coin grade, which is one of the key challenges both to humans and to automatic methods. We describe a series of methods – some adopted from previous work and others extensions thereof – and perform the first thorough analysis to date.
Contributions for the International Carbon Conference: CARBONE 84
This report is a compilation of papers prepared by KFA Jülich GmbH for the International Carbon Conference CARBONE 84, to be held in Bordeaux, France. The presentations deal with manufacturing objectives as well as technical and nuclear applications of carbonaceous materials.
Human Centric Facial Expression Recognition
Facial expression recognition (FER) is an area of active research, both in computer science and in behavioural science. Across these domains there is evidence to suggest that humans and machines find it easier to recognise certain emotions, for example happiness, in comparison to others. Recent behavioural studies have explored human perceptions of emotion further by evaluating the relative contribution of facial features to human sensitivity to emotion. It has been identified that certain facial regions carry more salient features for certain expressions of emotion, especially when the emotions are subtle in nature; for example, it is easier to detect fearful expressions when the eyes are expressive. Using this observation as a starting point, we examine how knowledge of facial feature saliency may be integrated into current approaches to automated FER. Specifically, we compare and evaluate the accuracy of ‘full-face’ versus upper and lower facial area convolutional neural network (CNN) models for emotion recognition in static images, and propose a human-centric CNN hierarchy which uses regional image inputs to leverage current understanding of how humans recognise emotions across the face. Evaluations using the CK+ dataset demonstrate that our hierarchy can enhance classification accuracy in comparison to individual CNN architectures, achieving overall true positive classification in 93.3% of cases.
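The abstract does not specify how the hierarchy combines its regional CNNs. As a minimal sketch of one plausible late-fusion scheme, assume each region (full face, upper, lower) has its own softmax classifier and that per-emotion saliency weights decide how much the eyes versus the mouth should count; the emotion list and all weights below are hypothetical, not values from the paper:

```python
EMOTIONS = ["anger", "fear", "happiness", "sadness", "surprise"]

# Hypothetical saliency weights: assumed fraction of each emotion's
# evidence carried by the upper face (eyes/brows). E.g. fear is read
# mostly from the eyes, happiness mostly from the mouth.
UPPER_WEIGHT = [0.5, 0.8, 0.3, 0.6, 0.7]

def fuse(p_full, p_upper, p_lower, alpha=0.5):
    """Late fusion of full-face and region-specific class probabilities.

    Each p_* is a probability vector over EMOTIONS (e.g. the softmax
    output of a separately trained CNN). Regional scores are blended
    per emotion according to saliency, then mixed with the full-face
    score and renormalised.
    """
    fused = []
    for pf, pu, pl, w in zip(p_full, p_upper, p_lower, UPPER_WEIGHT):
        regional = w * pu + (1 - w) * pl
        fused.append(alpha * pf + (1 - alpha) * regional)
    total = sum(fused)
    return [x / total for x in fused]

# Example: the eyes weakly suggest fear, the mouth strongly suggests happiness.
probs = fuse([0.1, 0.2, 0.4, 0.2, 0.1],
             [0.05, 0.6, 0.1, 0.15, 0.1],
             [0.1, 0.1, 0.6, 0.1, 0.1])
best = EMOTIONS[probs.index(max(probs))]
```

The design point is that the regional classifiers only need to be good at the emotions their region is salient for; the fusion weights encode the behavioural findings the abstract cites.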
Linking land cover and species distribution models to project potential ranges of malaria vectors: an example using Anopheles arabiensis in Sudan and Upper Egypt
Satellite Ocean Colour: Current Status and Future Perspective
Spectrally resolved water-leaving radiances (ocean colour) and inferred chlorophyll concentration are key to studying phytoplankton dynamics at seasonal and interannual scales, for a better understanding of the role of phytoplankton in marine biogeochemistry; the global carbon cycle; and the response of marine ecosystems to climate variability, change and feedback processes. Ocean colour data also have a critical role in operational observation systems monitoring coastal eutrophication, harmful algal blooms, and sediment plumes. The contiguous ocean-colour record reached 21 years in 2018; however, it comprises a number of one-off missions, so creating a consistent time series of ocean-colour data requires merging the individual sensors (including MERIS, Aqua-MODIS, SeaWiFS, VIIRS, and OLCI), with their differing characteristics, without introducing artefacts. By contrast, the next decade will see consistent observations from operational ocean colour series with sensors of similar design and with a replacement strategy. Also, by 2029 the record will start to be of sufficient duration to discriminate climate change impacts from natural variability, at least in some regions. This paper describes the current status and future prospects of the field of ocean colour, focusing on large- to medium-resolution observations of oceans and coastal seas. It reviews the user requirements in terms of products and uncertainty characteristics, and then describes features of current and future satellite ocean-colour sensors, both operational and innovative. The key role of in situ validation and calibration is highlighted, as are the ground segments that process the data received from the ocean-colour sensors and deliver analysis-ready products to end-users. Example applications of the ocean-colour data are presented, focusing on the climate data record and on operational applications including water quality and assimilation into numerical models. Current capacity building and training activities pertinent to ocean colour are described, and finally a summary of future perspectives is provided.
A compilation of global bio-optical in situ data for ocean colour satellite applications – version three
A global in situ data set for validation of ocean colour products from the ESA Ocean Colour Climate
Change Initiative (OC-CCI) is presented. This version of the compilation, starting in 1997, now extends to
2021, which is important for the validation of the most recent satellite optical sensors such as Sentinel 3B
OLCI and NOAA-20 VIIRS. The data set comprises in situ observations of the following variables: spectral remote-sensing reflectance, concentration of chlorophyll-a, spectral inherent optical properties, spectral diffuse
attenuation coefficient, and total suspended matter. Data were obtained from multi-project archives acquired via
open internet services or from individual projects acquired directly from data providers. Methodologies were
implemented for homogenization, quality control, and merging of all data. Minimal changes were made to the
original data, other than conversion to a standard format, elimination of some points after quality control, and
averaging of observations that were close in time and space. The result is a merged table available in text format.
Overall, the size of the data set grew to 148 432 rows, with each row representing a unique station in space
and time (cf. 136 250 rows in previous version; Valente et al., 2019). Observations of remote-sensing reflectance
increased to 68 641 (cf. 59 781 in previous version; Valente et al., 2019). There was also a near tenfold increase
in chlorophyll data since 2016. Metadata of each in situ measurement (original source, cruise or experiment,
principal investigator) are included in the final table. By making the metadata available, provenance is better
documented and it is also possible to analyse each set of data separately. The compiled data are available at
https://doi.org/10.1594/PANGAEA.941318 (Valente et al., 2022).
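The abstract mentions averaging observations that were close in time and space. A minimal sketch of how such a merge step can be done by binning stations on coordinates and time (the bin sizes, tuple layout, and function name are illustrative assumptions, not the actual thresholds or code of Valente et al.):

```python
from collections import defaultdict
from statistics import mean

def merge_stations(rows, space_deg=0.01, time_h=1.0):
    """Average observations that fall into the same space/time bin.

    rows: iterable of (lat, lon, hours_since_epoch, chl) tuples.
    Stations whose latitude, longitude, and time round into the same
    bin are collapsed into a single averaged value; bin widths are
    illustrative, not the compilation's actual matching criteria.
    """
    bins = defaultdict(list)
    for lat, lon, t, chl in rows:
        key = (round(lat / space_deg), round(lon / space_deg), round(t / time_h))
        bins[key].append(chl)
    return [mean(values) for values in bins.values()]

# Two nearby, near-simultaneous casts collapse into one row;
# a distant station stays separate.
rows = [(50.000, -4.000, 10.0, 1.0),
        (50.002, -4.001, 10.2, 3.0),
        (60.000, 5.000, 10.1, 2.0)]
merged = merge_stations(rows)
```

Keeping per-row metadata (source, cruise, principal investigator) alongside such bins is what makes the provenance analysis described above possible.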
A collaborative artefact reconstruction environment
A novel collaborative artefact reconstruction environment design is presented, informed by experimental task observation and participatory design. The motivation for the design was to enable collaborative human and computer effort in the reconstruction of fragmented cuneiform tablets: millennia-old clay tablets used for written communication in early human civilisation. Thousands of joining cuneiform tablet fragments are distributed within and between worldwide collections. The reconstruction of the tablets poses a complex 3D jigsaw puzzle with no physically tractable solution. In reconstruction experiments, participants collaborated synchronously and asynchronously on virtual and physical reconstruction tasks. Results are presented that demonstrate the difficulties experienced by human reconstructors in virtual tasks compared to physical tasks. Unlike their computer counterparts, humans have difficulty identifying joins in virtual environments but, unlike computers, humans are averse to making incorrect joins. A successful reconstruction environment would marry the opposing strengths and weaknesses of humans and computers, and provide, in the virtual setting, tools to support the communications and interactions that underpin successful physical performance. The paper presents a taxonomy of the communications and interactions observed in successful physical and synchronous collaborative reconstruction tasks. Tools for the support of these communications and interactions were successfully incorporated in the “i3D” virtual environment design presented.