Nanoscale magnetometry using a single spin system in diamond
We propose a protocol to estimate magnetic fields using a single
nitrogen-vacancy (N-V) center in diamond, where the estimate precision scales
inversely with time, ~1/T, rather than with the square root of time. The method is
based on converting the task of magnetometry into phase estimation, performing
quantum phase estimation on a single N-V nuclear spin using either adaptive or
nonadaptive feedback control, and the recently demonstrated capability to
perform single-shot readout within the N-V [P. Neumann et al., Science 329,
542 (2010)]. We present numerical simulations to show that our method provides
an estimate whose precision scales close to ~1/T (T is the total estimation
time), and moreover will give an unambiguous estimate of the static magnetic
field experienced by the N-V. By combining this protocol with recent proposals
for scanning magnetometry using an N-V, our protocol will provide a significant
decrease in signal acquisition time while providing an unambiguous spatial map
of the magnetic field.
Comment: 8 pages and 5 figures
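The claimed ~1/T scaling can be illustrated with a toy nonadaptive Bayesian phase-estimation simulation on a single qubit. This is a generic sketch of the idea, not the authors' protocol; all parameter values (field, grid, shot counts) are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy Ramsey magnetometry: the field enters as a precession frequency omega
# (proportional to B); a Ramsey experiment of duration t returns outcome 1
# with probability (1 - cos(omega * t)) / 2.
omega_true = 0.37                       # hypothetical "true" field (arb. units)
omegas = np.linspace(0.0, 1.0, 2001)    # grid over the prior range
posterior = np.full_like(omegas, 1.0 / len(omegas))

total_time = 0.0
for k in range(10):                     # exponentially growing interrogation times
    t = 0.5 * 2.0 ** k
    for _ in range(8):                  # a few single-shot readouts per duration
        p1 = 0.5 * (1.0 - np.cos(omega_true * t))
        outcome = rng.random() < p1
        if outcome:
            like = 0.5 * (1.0 - np.cos(omegas * t))
        else:
            like = 0.5 * (1.0 + np.cos(omegas * t))
        posterior *= like
        posterior /= posterior.sum()    # Bayesian update on the grid
        total_time += t

mean = float(np.sum(omegas * posterior))
std = float(np.sqrt(np.sum((omegas - mean) ** 2 * posterior)))
print(f"estimate = {mean:.4f} +/- {std:.4f} after total time {total_time:.0f}")
```

Because the interrogation time doubles each round, the posterior width shrinks roughly in proportion to 1/T rather than 1/sqrt(T), and the short-time rounds remove fringe ambiguity, so the estimate of the static field is unambiguous.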
A systematic approach to atomicity decomposition in Event-B
Event-B is a state-based formal method that supports a refinement process in which an abstract model is elaborated towards an implementation in a step-wise manner. One weakness of Event-B is that control flow between events is typically modelled implicitly via variables and event guards. While this fits well with Event-B refinement, it can make models involving sequencing of events more difficult to specify and understand than if control flow were explicitly specified. New events may be introduced in Event-B refinement, and these are often used to decompose the atomicity of an abstract event into a series of steps. A second weakness of Event-B is that there is no explicit link between such new events, which represent steps in the decomposition of atomicity, and the abstract event to which they contribute. To address these weaknesses, atomicity decomposition diagrams support the explicit modelling of control flow and refinement relationships for new events. In previous work, the atomicity decomposition approach has been evaluated manually in the development of two large case studies, a multimedia protocol and a spacecraft sub-system. The evaluation results helped us to develop a systematic definition of the atomicity decomposition approach, and to develop a tool supporting the approach. In this paper we outline this systematic definition of the approach and the tool that supports it, and evaluate the contribution that the tool makes.
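The implicit encoding of control flow that the abstract identifies as a weakness can be seen in a small sketch. In the Event-B style, an event is a guarded action over shared state, and sequencing such as "step1 then step2" is encoded only through a control variable that the guards test. The event names and the `pc` variable below are invented for illustration:

```python
# Toy guarded-event system in the Event-B style: each event is a
# (guard, action) pair over a shared state; the sequencing step1 -> step2
# is encoded implicitly through the control variable 'pc'.
state = {"pc": 0, "data": []}

events = {
    "step1": (lambda s: s["pc"] == 0,
              lambda s: (s["data"].append("a"), s.update(pc=1))),
    "step2": (lambda s: s["pc"] == 1,
              lambda s: (s["data"].append("b"), s.update(pc=2))),
}

def run(state, events):
    """Repeatedly fire any enabled event until none is enabled."""
    fired = True
    while fired:
        fired = False
        for name, (guard, action) in events.items():
            if guard(state):
                action(state)
                fired = True
                break
    return state

run(state, events)
print(state["data"])   # step2 can never fire before step1
```

Nothing in the model says "step2 follows step1" explicitly; a reader must reconstruct the ordering from the guards on `pc`, which is exactly what atomicity decomposition diagrams make explicit.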
Robust control of entanglement in a Nitrogen-vacancy centre coupled to a Carbon-13 nuclear spin in diamond
We address the problem of generating an entangling gate between electronic and
nuclear spins in the system of a single nitrogen-vacancy centre coupled to a
nearest Carbon-13 atom in diamond, with the gate robust against certain types
of systematic error, such as pulse-length and off-resonance errors. We analyse the
robustness of various control schemes: sequential pulses, composite pulses and
numerically-optimised pulses. We find that numerically-optimised pulses,
produced by the gradient ascent pulse engineering algorithm (GRAPE), are more
robust than the composite pulses and the sequential pulses. The optimised
pulses can also be implemented in a shorter time than the composite pulses.
Comment: 15 pages, 5 figures
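The robustness gain that composite pulses provide against pulse-length errors can be illustrated with the standard BB1 sequence, a generic textbook construction rather than the specific gates optimised in the paper; the 5% error level below is illustrative:

```python
import numpy as np

X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)

def rot(angle, phase):
    """Rotation by `angle` about the equatorial axis at azimuthal angle `phase`."""
    axis = np.cos(phase) * X + np.sin(phase) * Y
    return np.cos(angle / 2) * np.eye(2) - 1j * np.sin(angle / 2) * axis

def infidelity(U, V):
    return 1.0 - abs(np.trace(U.conj().T @ V)) / 2.0

theta = np.pi                          # target: a pi rotation about the x axis
target = rot(theta, 0.0)
eps = 0.05                             # 5% systematic pulse-length error

# Plain pulse: the rotation angle is scaled by (1 + eps).
plain = rot(theta * (1 + eps), 0.0)

# BB1 composite pulse (Wimperis): theta_0, pi_phi, 2pi_{3phi}, pi_phi
# with phi = arccos(-theta / (4 pi)); every angle suffers the same error.
phi = np.arccos(-theta / (4 * np.pi))
bb1 = np.eye(2, dtype=complex)
for ang, ph in [(theta, 0.0), (np.pi, phi), (2 * np.pi, 3 * phi), (np.pi, phi)]:
    bb1 = rot(ang * (1 + eps), ph) @ bb1   # later pulses act from the left

infid_plain = infidelity(target, plain)
infid_bb1 = infidelity(target, bb1)
print(f"plain: {infid_plain:.2e}, BB1: {infid_bb1:.2e}")
```

The composite sequence trades a longer total pulse time for a much smaller infidelity at the same systematic error, which is the trade-off the abstract notes when comparing composite and numerically-optimised pulses.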
Modelling the Combustion of Explosives
When an explosive burns, gaseous products are formed. The interaction of the burning solid and the gas is
not well understood. In particular, the process by which the gaseous products heat the explosive is yet to be explored
in detail. The present work aims to fill that gap using mathematical modelling, tracking the
temperature profile in the explosive and the gas’s response.
This work begins by modelling single-step reactions using the simple Arrhenius model. An alternative asymptotic
approach is also employed. There is close agreement between the results for the full reaction-diffusion problem and
the asymptotic problem. The model is then extended to include three-step reaction kinetics, where we again apply
asymptotic analysis in addition to direct computation. Further work, outlined briefly at the end, incorporates the
motion of the gas into the existing model, with temperature and pressure distributions considered.
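The single-step Arrhenius model can be sketched in its simplest, well-mixed (spatially uniform) limit, where diffusion is dropped and ignition appears as thermal runaway. All parameter values below are illustrative, not taken from the work:

```python
import math

# Single-step Arrhenius kinetics with thermal feedback (well-mixed limit):
#   d(lam)/dt = A * exp(-Ea / (R*T)) * (1 - lam)     reaction progress
#   dT/dt     = (Q / c) * d(lam)/dt                  temperature rise
A, Ea, R = 1.0e6, 8.0e4, 8.314   # pre-exponential (1/s), activation energy (J/mol)
Q, c = 2.0e6, 1.0e3              # heat release (J/kg), heat capacity (J/(kg K))
T, lam = 600.0, 0.0              # initial temperature (K), progress variable
dt, steps = 1.0e-5, 50000        # explicit Euler time step and step count

for _ in range(steps):
    rate = A * math.exp(-Ea / (R * T)) * (1.0 - lam)
    lam += dt * rate
    T += dt * (Q / c) * rate

print(f"after {steps * dt:.2f} s: lam = {lam:.3f}, T = {T:.0f} K")
```

The solution shows a slow induction phase followed by rapid runaway and completion, ending near the adiabatic flame temperature T0 + Q/c; the full reaction-diffusion model adds spatial heat conduction on top of this local balance.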
X-ray Synthesis Based on Triangular Mesh Models Using GPU-Accelerated Ray Tracing for Multi-modal Breast Image Registration
For image registration of breast MRI and X-ray mammography we apply detailed biomechanical models. Synthesizing X-ray mammograms from these models is an important processing step for optimizing registration parameters and deriving images for multi-modal diagnosis. A fast computation time for creating synthetic images is essential to enable a clinically relevant application. In this paper we present a method to create synthetic X-ray attenuation images with a hardware-optimized ray tracing algorithm on the ray tracing (RT) cores of recent graphics processing units (GPUs). The ray tracing algorithm calculates the attenuation of the X-rays by tracing through a triangular polygon mesh. We use the Vulkan API, which enables access to the RT cores. One frame for a mesh of over 5 million triangles and a detector resolution of 1080×1080 can be calculated and transferred to and from the GPU in about 0.76 s on an NVIDIA RTX 2070 Super GPU. Without the transfer overhead, as in an interactive application, the calculation runs in real time at more than 30 frames per second (fps) even for very large polygon models. The presented method calculates synthetic X-ray images in a short time and has the potential for real-time applications. It is also the first implementation using RT cores for this purpose. The toolbox will be available as open source.
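The core attenuation computation can be sketched on the CPU: intersect a ray with every triangle (Möller–Trumbore), sort the hit distances, sum the segments that lie inside the closed mesh, and apply Beer–Lambert attenuation. This is a minimal single-ray, single-material sketch with an invented cube mesh and attenuation coefficient; the paper's GPU/Vulkan implementation and detector loop are omitted:

```python
import math

def sub(a, b):   return (a[0]-b[0], a[1]-b[1], a[2]-b[2])
def cross(a, b): return (a[1]*b[2]-a[2]*b[1], a[2]*b[0]-a[0]*b[2], a[0]*b[1]-a[1]*b[0])
def dot(a, b):   return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]

def ray_triangle(orig, direc, v0, v1, v2, eps=1e-9):
    """Moller-Trumbore: return the hit distance t, or None for a miss."""
    e1, e2 = sub(v1, v0), sub(v2, v0)
    p = cross(direc, e2)
    det = dot(e1, p)
    if abs(det) < eps:
        return None                      # ray parallel to the triangle's plane
    inv = 1.0 / det
    s = sub(orig, v0)
    u = dot(s, p) * inv
    if u < 0.0 or u > 1.0:
        return None
    q = cross(s, e1)
    v = dot(direc, q) * inv
    if v < 0.0 or u + v > 1.0:
        return None
    t = dot(e2, q) * inv
    return t if t > eps else None

def attenuation(orig, direc, triangles, mu, i0=1.0):
    """Beer-Lambert attenuation of one ray through a closed single-material mesh."""
    hits = sorted(t for tri in triangles
                  if (t := ray_triangle(orig, direc, *tri)) is not None)
    # Consecutive entry/exit pairs bound the path segments inside the mesh.
    inside = sum(hits[i + 1] - hits[i] for i in range(0, len(hits) - 1, 2))
    return i0 * math.exp(-mu * inside)

# Unit cube as 12 triangles (8 vertices, 2 triangles per face).
V = [(0,0,0),(1,0,0),(1,1,0),(0,1,0),(0,0,1),(1,0,1),(1,1,1),(0,1,1)]
F = [(0,1,2),(0,2,3),(4,6,5),(4,7,6),(0,5,1),(0,4,5),
     (3,2,6),(3,6,7),(0,3,7),(0,7,4),(1,5,6),(1,6,2)]
mesh = [(V[a], V[b], V[c]) for a, b, c in F]

# Ray along +x through the cube: path length 1, so I = I0 * exp(-mu).
I = attenuation((-1.0, 0.3, 0.4), (1.0, 0.0, 0.0), mesh, mu=0.5)
print(I)
```

The GPU version offloads exactly this per-ray loop to hardware RT cores, one ray per detector pixel, which is where the reported speedup comes from.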
An analysis of compensation claims regarding personal injury and loss of earnings in several court cases
Road accidents are a major contributor to personal injury cases. The plaintiffs, or accident victims, are entitled to compensation for their injuries. This study aims to analyse the amount of damages received compared to the amount of damages in the personal injury guideline from the Completion of the Review of the Compendium of Personal Injury Awards. A comparison is carried out between the statutory multiplier set forth in Section 28A of the Civil Law (Amendment) Act 1984 and the Ogden Tables from the United Kingdom, customised with the Expected Life Tables of Malaysians. A total of 30 court cases from 1989 to 2013, all involving road accidents, are analysed in this study. The results show that there were two types of injury beyond the maximum range of the guidelines: scars and eye injuries. It is therefore suggested that a fairer multiplier be considered in dealing with loss of earnings.
Keywords: quantum; loss of earnings; multiplier
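The multiplier idea under comparison can be sketched as a discounted annuity: the award for loss of earnings is an annual loss (the multiplicand) times a multiplier that discounts future working years to present value. The discount rate, years, and annual loss below are illustrative only, not the statutory figures or actual Ogden/Malaysian life-table values, and real actuarial tables further weight each year by survival probability:

```python
# Illustrative loss-of-earnings award: multiplicand x multiplier, where the
# multiplier is the present value of one unit of income per remaining
# working year at a fixed discount rate.
def multiplier(years_of_loss, discount_rate):
    return sum((1.0 + discount_rate) ** -k for k in range(1, years_of_loss + 1))

annual_loss = 24_000           # hypothetical multiplicand (annual net loss)
m = multiplier(20, 0.025)      # 20 remaining working years, 2.5% discount rate
award = annual_loss * m
print(f"multiplier = {m:.3f}, award = {award:,.2f}")
```

Note that the multiplier (about 15.6 here) is well below the 20 remaining years: discounting is what separates a table-based multiplier from a simple years-times-salary award, and differing discount and mortality assumptions are why the statutory and Ogden approaches can diverge.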
Refining Nodes and Edges of State Machines
State machines are hierarchical automata that are widely used to structure complex behavioural specifications. We develop two notions of refinement of state machines: node refinement and edge refinement. We compare the two notions by means of examples and argue that, by adopting simple conventions, they can be combined into one method of refinement. In the combined method, node refinement can be used to develop architectural aspects of a model and edge refinement to develop algorithmic aspects. The two notions of refinement are grounded in previous work. Event-B is used as the foundation for our refinement theory, and UML-B state machine refinement influences the style of node refinement. Hence we propose a method with direct proof of state machine refinement, avoiding the detour via Event-B that is needed by UML-B.
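The node-refinement relationship can be sketched as a trace-preservation check: map each concrete state to the abstract node it refines, and require every concrete transition either to stay inside one abstract node (a new, stuttering step) or to map onto an abstract transition with the same event label. The machines and the check below are an invented illustration, not the paper's formalisation:

```python
# Abstract machine: transitions as (source, event, target) triples.
abstract = {("Idle", "start", "Busy"), ("Busy", "finish", "Idle")}

# Concrete machine: the Busy node has been refined into two sub-states.
concrete = {("Idle", "start", "Busy1"),
            ("Busy1", "step", "Busy2"),    # new event, internal to Busy
            ("Busy2", "finish", "Idle")}

# Node-refinement mapping: concrete state -> abstract node it refines.
mapping = {"Idle": "Idle", "Busy1": "Busy", "Busy2": "Busy"}

def refines(concrete, abstract, mapping):
    """Every concrete step is either internal to one abstract node
    or the image of an abstract transition with the same event."""
    for src, event, tgt in concrete:
        a_src, a_tgt = mapping[src], mapping[tgt]
        if a_src == a_tgt:
            continue                       # stuttering step inside a node
        if (a_src, event, a_tgt) not in abstract:
            return False
    return True

print(refines(concrete, abstract, mapping))   # this refinement is valid
```

Edge refinement would instead replace a single abstract transition by a sub-machine between its endpoints; the check above captures only the node-refinement direction of the combined method.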