Global and regional importance of the direct dust-climate feedback.
Feedbacks between the global dust cycle and the climate system might have amplified past climate changes. Yet, it remains unclear what role the dust-climate feedback will play in future anthropogenic climate change. Here, we estimate the direct dust-climate feedback, arising from changes in the dust direct radiative effect (DRE), using a simple theoretical framework that combines constraints on the dust DRE with a series of climate model results. We find that the direct dust-climate feedback is likely in the range of -0.04 to +0.02 W m^-2 K^-1, such that it could account for a substantial fraction of the total aerosol feedbacks in the climate system. On a regional scale, the direct dust-climate feedback is enhanced by approximately an order of magnitude close to major source regions. This suggests that it could play an important role in shaping the future climates of Northern Africa, the Sahel, the Mediterranean region, the Middle East, and Central Asia.
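For orientation, the feedback parameter reported above has the standard form of a change in radiative effect per unit of surface warming; in generic notation (not the paper's own symbols) it reads

    \lambda_{\mathrm{dust}} = \frac{\Delta \mathrm{DRE}}{\Delta T_s}
    \quad \left[\mathrm{W\,m^{-2}\,K^{-1}}\right],

so the roughly tenfold regional enhancement quoted for major source regions corresponds to values on the order of a few tenths of a W m^-2 K^-1.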
Optimal design of nonuniform FIR transmultiplexer using semi-infinite programming
This paper considers an optimum nonuniform FIR transmultiplexer design problem subject to specifications in the frequency domain. Our objective is to minimize the sum of the ripple energy of all the individual filters, subject to the specifications on amplitude and aliasing distortions, and to the passband and stopband specifications for the individual filters. This optimum nonuniform transmultiplexer design problem can be formulated as a quadratic semi-infinite programming problem. The dual parametrization algorithm is extended to this nonuniform transmultiplexer design problem. If the filter lengths are sufficiently long and the set of decimation integers is compatible, then a solution exists. Since the problem is formulated as a convex problem, if a solution exists, then the solution obtained is unique and any local solution is a global minimum.
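To make the flavour of this formulation concrete, the sketch below sets up a much-reduced version of such a design as a convex quadratic program for a single lowpass prototype filter: stopband ripple energy is minimized subject to passband magnitude constraints. The filter length, band edges and ripple bound are invented for illustration, the semi-infinite frequency constraints are simply discretized on a grid rather than handled by the dual parametrization algorithm of the paper, and the transmultiplexer-specific amplitude/aliasing distortion constraints are omitted.

    # Sketch: convex FIR design with frequency-domain constraints, illustrating the
    # kind of quadratic program discussed above (a single zero-phase-style cosine
    # prototype; the paper's transmultiplexer couples many filters and adds
    # aliasing/distortion constraints). Frequencies are discretized -- a stand-in
    # for the semi-infinite constraints handled by the dual parametrization algorithm.
    import numpy as np
    import cvxpy as cp

    N = 32                              # number of cosine coefficients (hypothetical)
    wp, ws = 0.2 * np.pi, 0.3 * np.pi   # passband/stopband edges (hypothetical)
    delta_p = 0.05                      # allowed passband ripple (hypothetical)

    n = np.arange(N)
    grid = np.linspace(0, np.pi, 400)

    def response_matrix(freqs):
        # Rows map the coefficients h[n] to the real (zero-phase) response at each frequency.
        return np.cos(np.outer(freqs, n))

    h = cp.Variable(N)

    # Stopband "ripple energy": the integral of the squared response over the stopband,
    # approximated by a sum over the grid -> a convex quadratic in h.
    A_sb = response_matrix(grid[grid >= ws])
    energy = cp.sum_squares(A_sb @ h)

    # Passband constraints: response stays within 1 +/- delta_p on the grid.
    A_pb = response_matrix(grid[grid <= wp])
    constraints = [A_pb @ h <= 1 + delta_p, A_pb @ h >= 1 - delta_p]

    prob = cp.Problem(cp.Minimize(energy), constraints)
    prob.solve()
    print("optimal stopband energy:", prob.value)

Because the objective is a convex quadratic and the constraints are affine in the coefficients, any local minimum found by the solver is global, mirroring the uniqueness argument in the abstract.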
The role of entanglement in calibrating optical quantum gyroscopes
We consider the calibration of an optical quantum gyroscope by modeling two Sagnac interferometers mounted approximately at right angles to each other. Reliable operation requires that we know the angle between the interferometers with high precision, and we show that a procedure akin to multi-position testing in inertial navigation systems can be generalized to the case of quantum interferometry. We find that while entanglement is a key resource within an individual Sagnac interferometer, its presence between the interferometers is a far more complicated story. The optimum level of entanglement depends strongly on the parameter values being sought, and small but significant improvements may be gained from choosing states with the optimal amount of entanglement between the interferometers.
Reaching micro-arcsecond astrometry with long baseline optical interferometry; application to the GRAVITY instrument
A basic principle of long baseline interferometry is that an optical path difference (OPD) directly translates into an astrometric measurement. In the simplest case, the OPD is equal to the scalar product between the vector linking the two telescopes and the normalized vector pointing toward the star. However, an overly simple interpretation of this scalar product leads to seemingly conflicting results, called here "the baseline paradox". For micro-arcsecond accuracy astrometry, we have to model the metrology measurement in full. It involves a complex system subject to many optical effects: from pure baseline errors to static, quasi-static and high-order optical aberrations. The goal of this paper is to present the strategy used by the "General Relativity Analysis via VLT InTerferometrY" instrument (GRAVITY) to minimize the biases introduced by these defects. It is possible to give an analytical formula for how the baseline and tip-tilt errors affect the astrometric measurement. This formula depends on the limit-points of three types of baselines: the wide-angle baseline, the narrow-angle baseline, and the imaging baseline. We also numerically include non-common-path higher-order aberrations, whose amplitudes were measured during technical time at the Very Large Telescope Interferometer. We end by simulating the influence of high-order common-path aberrations due to atmospheric residuals calculated from a Monte-Carlo simulation tool for adaptive optics systems. The result of this work is an error budget of the biases caused by the multiple optical imperfections, including optical dispersion. We show that beam stabilization through both focal and pupil tracking is crucial to the GRAVITY system. Assuming the instrument pupil is stabilized at the 4 cm level on M1, and a field tracking below 0.2, we show that GRAVITY will be able to reach its objective of 10 μas accuracy.
Comment: 14 pages. Accepted by A&A.
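As a back-of-the-envelope illustration of the scalar-product relation above, the short sketch below evaluates the differential OPD between two stars separated by 10 micro-arcseconds as seen over a 100 m baseline; the baseline length, geometry and coordinate choices are illustrative and not GRAVITY parameters beyond the 10 μas goal stated in the abstract.

    # Sketch: the differential OPD between two stars observed with the same baseline
    # is roughly B * dTheta for small separations, so micro-arcsecond astrometry over
    # ~100 m baselines requires nanometre-level OPD metrology. Numbers are illustrative.
    import numpy as np

    MAS = np.pi / 180.0 / 3600.0 / 1000.0   # 1 milli-arcsecond in radians
    MUAS = MAS / 1000.0                     # 1 micro-arcsecond in radians

    baseline = np.array([100.0, 0.0, 0.0])  # projected baseline vector, metres (illustrative)
    s_ref = np.array([0.0, 0.0, 1.0])       # unit vector toward the reference star

    def opd(baseline, pointing):
        """OPD = scalar product of the baseline vector and the unit pointing vector."""
        return float(np.dot(baseline, pointing))

    # Target star offset by 10 micro-arcsec from the reference, along the baseline direction.
    dtheta = 10.0 * MUAS
    s_target = np.array([np.sin(dtheta), 0.0, np.cos(dtheta)])

    d_opd = opd(baseline, s_target) - opd(baseline, s_ref)
    print(f"differential OPD for 10 uas over 100 m: {d_opd * 1e9:.2f} nm")   # ~4.85 nm

The result, roughly 5 nm of differential OPD, is why the abstract insists on modelling the metrology chain in full: nanometre-level optical errors map directly into micro-arcsecond astrometric biases.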
The interferometric baselines and GRAVITY astrometric error budget
GRAVITY is a new generation beam-combination instrument for the VLTI. Its goal is to achieve micro-arcsecond astrometric accuracy between objects separated by a few arcsec. This accuracy on astrometric measurements is the most important challenge of the instrument, and a careful error budget has been paramount during its technical design. In this poster, we focus on baseline-induced errors, which are part of a larger error budget.
Comment: SPIE Meeting 2014 -- Montreal.
A miniature sensor for electrical field measurements in dusty planetary atmospheres
"Dusty phenomena such as regular wind-blown dust, dust storms, and dust devils are the most important, currently active, geological processes on Mars. Electric fields larger than 100 kV/m have been measured in terrestrial dusty phenomena. Theoretical calculations predict that, close to the surface, the bulk electric fields in martian dusty phenomena reach the breakdown value of the isolating properties of thin martian air of about a few 10 kV/m. The fact that martian dusty phenomena are electrically active has important implications for dust lifting and atmospheric chemistry. Electric field sensors are usually grounded and distort the electric fields in their vicinity. Grounded sensors also produce large errors when subject to ion currents or impacts from clouds of charged particles. Moreover, they are incapable of providing information about the direction of the electric field, an important quantity. Finally, typical sensors with more than 10 cm of diameter are not capable of measuring electric fields at distances as small as a few cm from the surface. Measurements this close to the surface are necessary for studies of the effects of electric fields on dust lifting. To overcome these shortcomings, we developed the miniature electric-field sensor described in this article."http://deepblue.lib.umich.edu/bitstream/2027.42/64202/1/jpconf8_142_012075.pd
Validation of the performance of a GMO multiplex screening assay based on microarray detection
A new screening method for the detection and identification of GMOs, based on the use of multiplex PCR followed by microarray detection, has been developed and is presented. The technology is based on the identification of quite ubiquitous GMO genetic target elements, first amplified by PCR and then directly hybridised on a predefined microarray (DualChip® GMO, Eppendorf, Germany). The validation was performed within the framework of a European project (Co-Extra, contract no 007158) and in collaboration with 12 laboratories specialised in GMO detection. The present study reports the strategy and the results of an ISO-complying validation of the method carried out through an inter-laboratory study. Sets of blind samples were provided, consisting of DNA reference materials covering all the elements detectable by specific probes present on the array. The GMO concentrations varied from 1% down to 0.045%. In addition, a mixture of two GMO events (0.1% RRS diluted in 100% TOPAS19/2) was incorporated in the study to test the robustness of the assay in extreme conditions. Data were processed according to the ISO 5725 standard. The method was evaluated with predefined performance criteria with respect to the EC CRL method acceptance criteria. The overall method performance met the acceptance criteria; in particular, the results showed that the method is suitable for the detection of the different target elements at a 0.1% concentration of GMO with a 95% accuracy rate. This collaborative trial showed that the method can be considered fit for the purpose of screening with respect to its intra- and inter-laboratory accuracy. The results demonstrated the validity of combining multiplex PCR with array detection, as provided by the DualChip® GMO (Eppendorf, Germany), for the screening of GMOs. The results showed that the technology is robust, practical and suitable as a screening tool.
Agent-based homeostatic control for green energy in the smart grid
With dwindling non-renewable energy reserves and the adverse effects of climate change, the development of the smart electricity grid is seen as key to solving global energy security issues and to reducing carbon emissions. In this respect, there is a growing need to integrate renewable (or green) energy sources in the grid. However, the intermittency of these energy sources requires that demand must also be made more responsive to changes in supply, and a number of smart grid technologies are being developed, such as high-capacity batteries and smart meters for the home, to enable consumers to be more responsive to conditions on the grid in real time. Traditional solutions based on these technologies, however, tend to ignore the fact that individual consumers will behave in such a way that best satisfies their own preferences to use or store energy (as opposed to those of the supplier or the grid operator). Hence, in practice, it is unclear how these solutions will cope with large numbers of consumers using their devices in this way. Against this background, in this paper, we develop novel control mechanisms based on the use of autonomous agents to better incorporate consumer preferences in managing demand. These agents, residing on consumers' smart meters, can both communicate with the grid and optimise their owner's energy consumption to satisfy their preferences. More specifically, we provide a novel control mechanism that models and controls a system comprising a green energy supplier operating within the grid and a number of individual homes (each possibly owning a storage device). This control mechanism is based on the concept of homeostasis, whereby control signals are sent to individual components of a system, based on their continuous feedback, in order to change their state so that the system may reach a stable equilibrium. Thus, we define a new carbon-based pricing mechanism for this green energy supplier that takes advantage of carbon-intensity signals available on the internet in order to provide real-time pricing. The pricing scheme is designed in such a way that it can be readily implemented using existing communication technologies and is easily understandable by consumers. Building upon this, we develop new control signals that the supplier can use to incentivise agents to shift demand (using their storage device) to times when green energy is available. Moreover, we show how these signals can be adapted according to changes in supply and to various degrees of penetration of storage in the system. We empirically evaluate our system and show that, when all homes are equipped with storage devices, the supplier can significantly reduce its reliance on other carbon-emitting power sources to cater for its own shortfalls. By so doing, the supplier reduces the carbon emissions of the system by up to 25%, while the consumer reduces its costs by up to 14.5%. Finally, we demonstrate that our homeostatic control mechanism is not sensitive to small prediction errors, and that the supplier is incentivised to accurately predict its green production to minimise costs.
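As a toy illustration of the kind of carbon-intensity-driven pricing described above (and emphatically not the paper's homeostatic control mechanism), the sketch below maps a carbon-intensity signal to a per-slot price and lets a storage-equipped home greedily shift a deferrable load into the greenest slots; the intensity values, price mapping and scheduling rule are all invented for the example.

    # Illustrative sketch of carbon-intensity-based real-time pricing and naive
    # demand shifting with home storage. NOT the paper's mechanism; the intensity
    # values, price mapping and greedy scheduling rule below are assumptions.
    from dataclasses import dataclass

    @dataclass
    class Slot:
        hour: int
        carbon_intensity: float   # gCO2/kWh, e.g. from a grid-operator feed (hypothetical values)

    def price(slot, base=0.05, carbon_rate=0.0002):
        """Price per kWh grows linearly with the carbon intensity of the slot."""
        return base + carbon_rate * slot.carbon_intensity

    def schedule_deferrable(slots, energy_kwh, max_kwh_per_slot):
        """Greedily place a deferrable load (e.g. charging a home battery) into the
        cheapest -- i.e. greenest -- slots."""
        plan = {s.hour: 0.0 for s in slots}
        remaining = energy_kwh
        for s in sorted(slots, key=price):
            if remaining <= 0:
                break
            take = min(max_kwh_per_slot, remaining)
            plan[s.hour] = take
            remaining -= take
        return plan

    slots = [Slot(h, ci) for h, ci in enumerate([450, 400, 320, 200, 180, 250, 380, 500])]
    plan = schedule_deferrable(slots, energy_kwh=6.0, max_kwh_per_slot=2.0)
    cost = sum(plan[s.hour] * price(s) for s in slots)
    print(plan)
    print(f"total cost: {cost:.3f}")

A real deployment would also need the supplier-side control signals and the adaptation to storage penetration discussed in the abstract; the point here is only that a price tracking carbon intensity already gives storage owners an incentive to shift demand toward green supply.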
Continuous-Variable Quantum Key Distribution using Thermal States
We consider the security of continuous-variable quantum key distribution using thermal (or noisy) Gaussian resource states. Specifically, we analyze this against collective Gaussian attacks using direct and reverse reconciliation, where both protocols use either homodyne or heterodyne detection. We show that in the case of direct reconciliation with heterodyne detection, an improved robustness to channel noise is achieved when large amounts of preparation noise are added, as compared to the case when no preparation noise is added. We also consider the theoretical limit of infinite preparation noise and show that a secure key can still be achieved in this limit, provided the channel noise is less than the preparation noise. Finally, we consider the security of quantum key distribution at various electromagnetic wavelengths, derive an upper bound related to an entanglement-breaking eavesdropping attack, and discuss the feasibility of microwave quantum key distribution.
Comment: 12 pages, 11 figures. Updated from the published version with some minor corrections.
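One standard piece of physics behind the wavelength discussion in this abstract can be made concrete: the mean thermal photon number of a bosonic mode at temperature T is n̄ = 1/(exp(hf/kT) − 1), which is negligible at optical frequencies but large at microwave frequencies, so microwave implementations are unavoidably in the large-preparation-noise regime. The short sketch below evaluates this occupation at two example wavelengths; the 300 K temperature and the wavelengths chosen are illustrative, not values taken from the paper.

    # Sketch: why thermal-state QKD matters much more at microwave than at optical
    # frequencies. Mean thermal photon number of a mode in equilibrium at temperature T
    # follows the Bose-Einstein occupation n = 1/(exp(h f / k T) - 1). T = 300 K and
    # the example wavelengths are illustrative choices.
    import numpy as np

    H = 6.62607015e-34   # Planck constant, J s
    K = 1.380649e-23     # Boltzmann constant, J/K
    C = 2.99792458e8     # speed of light, m/s

    def mean_thermal_photons(wavelength_m, temperature_k=300.0):
        f = C / wavelength_m
        return 1.0 / np.expm1(H * f / (K * temperature_k))

    for label, lam in [("optical (1550 nm)", 1550e-9), ("microwave (3 cm, ~10 GHz)", 0.03)]:
        print(f"{label}: n_bar ~ {mean_thermal_photons(lam):.3g}")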