
    Optimal photonic indistinguishability tests in multimode networks

    Particle indistinguishability is at the heart of the quantum statistics that regulate fundamental phenomena such as the electronic band structure of solids, Bose-Einstein condensation and superconductivity. It is also essential in practical applications such as linear-optical quantum computation and simulation, in particular for Boson Sampling devices. It is thus crucial to develop tools that certify genuine multiphoton interference between multiple sources. Here we show that so-called Sylvester interferometers are near-optimal for the task of discriminating the behaviours of distinguishable and indistinguishable photons. We report the first implementations of integrated Sylvester interferometers with 4 and 8 modes, realized with an efficient, scalable and reliable 3D architecture. We perform two-photon interference experiments capable of identifying indistinguishable-photon behaviour with a Bayesian approach using very small data sets. Furthermore, we experimentally employ this new device for the assessment of scattershot Boson Sampling. These results open the way to the application of Sylvester interferometers for the optimal assessment of multiphoton interference experiments.
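    The Bayesian approach mentioned above can be sketched as sequential posterior updating over two hypotheses. This is a minimal illustration only: the outcome probabilities below are invented placeholders, not the Sylvester-interferometer predictions from the paper.

```python
import math

# Hedged sketch: Bayesian discrimination between the hypothesis that two
# photons are indistinguishable (H_i) and that they are distinguishable
# (H_d), from a sequence of coincidence / no-coincidence outcomes.
# These per-event probabilities are illustrative assumptions only.
P_COINC = {"H_i": 0.1, "H_d": 0.5}

def posterior_indistinguishable(outcomes, prior=0.5):
    """Update P(H_i | data) event by event; outcomes is a list of
    booleans (True = coincidence detected at the output pair)."""
    log_odds = math.log(prior / (1.0 - prior))
    for coinc in outcomes:
        p_i = P_COINC["H_i"] if coinc else 1.0 - P_COINC["H_i"]
        p_d = P_COINC["H_d"] if coinc else 1.0 - P_COINC["H_d"]
        log_odds += math.log(p_i / p_d)
    return 1.0 / (1.0 + math.exp(-log_odds))

# A handful of mostly non-coincident events already favours H_i,
# consistent with the claim that very small data sets suffice:
p = posterior_indistinguishable([False, False, True, False, False, False])
```

    Because each event multiplies the odds by a likelihood ratio, the posterior can become decisive after only a few detections, which is why such tests work with very small data sets.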

    Expectation Maximization for Hard X-ray Count Modulation Profiles

    This paper is concerned with the image reconstruction problem in which the measured data are solar hard X-ray modulation profiles obtained from the Reuven Ramaty High Energy Solar Spectroscopic Imager (RHESSI) instrument. Our goal is to demonstrate that a statistical iterative method classically applied to image deconvolution is very effective when used to analyse count modulation profiles in solar hard X-ray imaging based on Rotating Modulation Collimators. The algorithm described in this paper solves the maximum likelihood problem iteratively, encoding a positivity constraint into the iterative optimization scheme. The result is therefore a classical Expectation Maximization method, applied this time not to image deconvolution but to image reconstruction from count modulation profiles. The technical feature that makes our implementation particularly effective in this application is a very reliable stopping rule, which regularizes the solution while providing a very satisfactory Cash statistic (C-statistic). The method is applied both to reproduce synthetic flaring configurations and to reconstruct images from experimental data corresponding to three real events. In this second case, Expectation Maximization shows accuracy comparable to Pixon image reconstruction with a notably reduced computational burden, and better fidelity to the measurements than CLEAN with comparable computational effectiveness. If optimally stopped, Expectation Maximization is a very reliable method for image reconstruction in the RHESSI context when count modulation profiles are used as input data.
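    The core iteration can be sketched as the classical multiplicative EM (Richardson-Lucy) update for Poisson counts, which keeps the image nonnegative by construction; the C-statistic then measures fit to the counts. The forward operator and data below are toy stand-ins, not RHESSI modulation profiles.

```python
import numpy as np

# Hedged sketch of the EM update for Poisson data y ~ Poisson(H @ x).
# The multiplicative form enforces the positivity constraint automatically:
# starting from x > 0, every iterate stays nonnegative.
def em_reconstruct(H, y, n_iter=200, eps=1e-12):
    """Iterate x <- x * [H.T @ (y / Hx)] / [H.T @ 1]; return the
    reconstruction and the final Cash statistic."""
    x = np.ones(H.shape[1])
    norm = H.T @ np.ones(len(y))
    for _ in range(n_iter):
        model = np.maximum(H @ x, eps)
        x *= (H.T @ (y / model)) / np.maximum(norm, eps)
    model = np.maximum(H @ x, eps)
    # Cash statistic: 2 * sum(model - y + y*log(y/model)); terms with
    # y == 0 drop the logarithm.
    ratio = np.where(y > 0, y, 1.0) / model
    c = 2.0 * np.sum(model - y + np.where(y > 0, y * np.log(ratio), 0.0))
    return x, c

# Toy forward operator and synthetic counts:
rng = np.random.default_rng(0)
H = rng.random((50, 10))
x_true = rng.random(10)
y = rng.poisson(H @ x_true * 50.0)
x_hat, cstat = em_reconstruct(H, y)
```

    In the paper's scheme the iteration count is not fixed in advance: the stopping rule halts the loop when further iterations would start fitting noise, which is what provides the regularization.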

    Evaluation of Probabilistic Methodology for Predicting Satellite Tracking Resources

    This research evaluates a probabilistic methodology for estimating the ability of satellite tracking networks to provide tracking and data acquisition services to large constellations of satellites. This approach, developed by Hagar, is evaluated using Monte Carlo simulations of optimal satellite contact scheduling on a tracking network for a certain class of satellites. The results of the scheduled Monte Carlo simulations were compared to the values predicted with Hagar's methodology for a range of constellation and network sizes. Comparison methods include percent difference, a Wilcoxon signed-rank test and a Mann-Whitney U test. The Monte Carlo simulations were run only for low Earth orbit (LEO) satellites in circular orbits at random altitudes ranging from 180 km to 1000 km and inclinations from near equatorial to near polar. For each Monte Carlo sample, the orbit-plane orientations and initial satellite positions were randomly generated. Ninety-six different cases were simulated and compared to their respective counterparts from the probabilistic approach. The results indicate that the probabilistic method is not yet mature: although it gives a fair approximation of network capabilities, it lacks the accuracy needed to serve as a standalone analysis tool. With additional research and adjustment, the method could give satellite network users and planners a useful tool for predicting the ability of tracking and data acquisition networks to meet current and projected satellite tracking needs.
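    The three comparison methods named above can be sketched with standard statistical routines. The data here are synthetic, not the study's 96 cases; the point is only how predicted values are compared against Monte Carlo results.

```python
import numpy as np
from scipy import stats

# Hedged sketch of the paper's comparison methods: percent difference,
# a Wilcoxon signed-rank test (paired) and a Mann-Whitney U test
# (unpaired). Synthetic data only, for illustration.
rng = np.random.default_rng(1)
monte_carlo = rng.normal(100.0, 5.0, size=30)        # e.g. contacts scheduled
predicted = monte_carlo + rng.normal(2.0, 3.0, 30)   # methodology's predictions

pct_diff = 100.0 * (predicted - monte_carlo) / monte_carlo
w_stat, w_p = stats.wilcoxon(monte_carlo, predicted)     # paired comparison
u_stat, u_p = stats.mannwhitneyu(monte_carlo, predicted) # distribution comparison
```

    The signed-rank test exploits the pairing between each case and its prediction, while the Mann-Whitney test compares the two samples as unpaired distributions; reporting both guards against the tests disagreeing on marginal cases.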

    Human-Centered Systems Analysis of Aircraft Separation from Adverse Weather

    Adverse weather significantly impacts the safety and efficiency of flight operations. Weather information plays a key role in mitigating this impact by supporting air transportation decision-makers' awareness of operational and mission risks. The emergence of new technologies for the surveillance, modeling, dissemination and presentation of information provides opportunities for improving both weather information and user decision-making. In order to support the development of new weather information systems, it is important to understand this complex problem thoroughly. This thesis applies a human-centered systems engineering approach to the problem of separating aircraft from adverse weather, explicitly considering the role of the human operator as part of the larger operational system. A series of models describing the interaction of the key elements of the adverse aircraft-weather encounter problem was developed, together with a framework that characterizes users' temporal decision-making. Another framework, which better matches pilots' perspectives than traditional forecast verification methods, articulated the value of forecast valid time according to a space-time reference frame. The models and frameworks were validated through focused interviews with ten national subject matter experts in aviation meteorology or flight operations. The experts unanimously supported the general structure of the models and suggested clarifications and refinements, which were integrated into the final models. In addition, a cognitive walk-through of three adverse aircraft-weather encounters was conducted to provide an experiential perspective on the aviation weather problem. The scenarios were chosen to represent three of the most significant aviation weather hazards: icing, convective weather, and low ceilings and visibility. They were built on actual meteorological information, and the missions and pilot decisions were synthesized to investigate important weather encounter events. The cognitive walk-through and the models were then used to identify opportunities for improving weather information and training. The most significant of these include opportunities to address users' four-dimensional, trajectory-centric perspectives and to improve pilots' ability to make contingency plans when dealing with stochastic information.

    Camouflage assessment: Machine and human

    A vision model is designed using low-level vision principles so that it can perform as a human observer model for camouflage assessment. In a camouflaged-object assessment task using military patterns in an outdoor environment, human performance at detection and recognition is compared with that of the model. This involved field data acquisition and subsequent image calibration, a human experiment, and the design of the vision model. Human and machine performance at recognition and detection of military patterns in two environments was found to correlate highly. Our model offers an inexpensive, automated, and objective method for assessing camouflage where it is impractical, or too expensive, to use human observers to evaluate the conspicuity of a large number of candidate patterns. Furthermore, the method should generalize to the assessment of visual conspicuity in non-military contexts.
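    The kind of low-level conspicuity measure such an observer model might compute can be sketched very simply: compare first- and second-order luminance statistics of a candidate pattern against its local background. This metric is an illustrative assumption, not the paper's actual vision model.

```python
import numpy as np

# Hedged sketch: a toy conspicuity score combining normalized differences
# in mean luminance and RMS contrast between a pattern patch and the
# background it must blend into. Purely illustrative.
def conspicuity(target, background):
    """Higher score = pattern stands out more from the background."""
    dl = abs(target.mean() - background.mean()) / (background.mean() + 1e-9)
    dc = abs(target.std() - background.std()) / (background.std() + 1e-9)
    return dl + dc

rng = np.random.default_rng(2)
bg = rng.normal(0.5, 0.10, (64, 64))      # background texture statistics
good = rng.normal(0.5, 0.10, (16, 16))    # well-matched candidate pattern
bad = rng.normal(0.8, 0.02, (16, 16))     # mismatched candidate pattern
```

    Ranking many candidate patterns by such a score is exactly the use case the abstract describes: cheap, automated screening before any human trials.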

    Supporting Intelligent and Trustworthy Maritime Path Planning Decisions

    The risk of maritime collisions and groundings has increased dramatically in the past five years despite technological advancements such as GPS-based navigation tools and electronic charts, which may add to, instead of reduce, workload. We propose that an automated path planning tool for littoral navigation can reduce workload and improve overall system efficiency, particularly under time pressure. To this end, a Maritime Automated Path Planner (MAPP) was developed, incorporating information requirements derived from a cognitive task analysis, with special emphasis on designing for trust. Human-in-the-loop experimental results showed that MAPP was successful in reducing the time required to generate an optimized path, as well as in reducing path lengths. The results also showed that while users gave the tool high acceptance ratings, they rated MAPP as average for trust, which we propose is the appropriate level of trust for such a system. This work was sponsored by Rite Solutions Inc., Assett Inc., Mikel Inc., and the Office of Naval Research. We would also like to thank Northeast Maritime Institute, the MIT NROTC detachment, the crew of the USS New Hampshire, and the anonymous reviewers whose comments significantly improved the paper.
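    The path optimization a tool like this performs can be sketched as graph search over a chart grid. The abstract does not specify MAPP's algorithm or cost model, so the A* search below, with blocked cells standing in for shoals and land, is an assumed illustration of the general technique.

```python
import heapq

# Hedged sketch: grid-based A* path planning, with grid[r][c] == 1
# marking cells blocked by hazards. Illustrative only; not MAPP itself.
def astar(grid, start, goal):
    """4-connected A* with a Manhattan-distance heuristic.
    Returns the shortest path as a list of cells, or None."""
    rows, cols = len(grid), len(grid[0])
    open_heap = [(0, start)]
    g = {start: 0}
    came = {}
    while open_heap:
        _, cur = heapq.heappop(open_heap)
        if cur == goal:                     # reconstruct path backwards
            path = [cur]
            while cur in came:
                cur = came[cur]
                path.append(cur)
            return path[::-1]
        r, c = cur
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = g[cur] + 1
                if ng < g.get((nr, nc), float("inf")):
                    g[(nr, nc)] = ng
                    came[(nr, nc)] = cur
                    h = abs(nr - goal[0]) + abs(nc - goal[1])
                    heapq.heappush(open_heap, (ng + h, (nr, nc)))
    return None

chart = [[0, 0, 0, 0],
         [0, 1, 1, 0],   # 1 = hazard (e.g. shoal)
         [0, 0, 0, 0]]
route = astar(chart, (0, 0), (2, 3))
```

    A real planner would weight edges by distance, depth margin, and traffic rather than unit steps, but the minimized-path-length result reported above corresponds to exactly this kind of optimality guarantee.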

    Prediction of visibility and aerosol within the operational Met Office unified model. I: Model formulation and variational assimilation

    The formulation and performance of the Met Office visibility analysis and prediction system are described. The visibility diagnostic within the limited-area Unified Model is a function of humidity and a prognostic aerosol content. The aerosol model includes advection, industrial and general urban sources, plus boundary-layer mixing and removal by rain. The assimilation is a three-dimensional variational scheme in which the visibility observation operator is a very nonlinear function of humidity, aerosol and temperature. A quality-control scheme for visibility data is included. Visibility observations can give rise to humidity increments of significant magnitude compared with the direct impact of humidity observations. We present the results of sensitivity studies which show the contribution of different components of the system to improved skill in visibility forecasts. Visibility assimilation is most important within the first 6-12 hours of the forecast and for visibilities below 1 km, while modelling of aerosol sources and advection is important for slightly higher visibilities (1-5 km) and is still significant at longer forecast times.
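    The variational assimilation with a nonlinear visibility operator can be sketched in a toy form: minimize a background-plus-observation cost function where visibility enters through an assumed Koschmieder-like relation. The operator and weights below are invented for illustration and are not the Unified Model formulation.

```python
import numpy as np
from scipy.optimize import minimize

# Hedged sketch of a 3D-Var-style analysis step with a nonlinear
# visibility observation operator. The extinction model is an assumed
# toy relation between aerosol and relative humidity.
def obs_operator(x):
    aerosol, rh = x
    beta = max(aerosol * (1.0 + 4.0 * rh**2), 1e-6)  # assumed extinction
    return 3.912 / beta                               # visibility (km)

def cost(x, xb, B_inv, y, r_inv):
    """J(x) = (x-xb)' B^-1 (x-xb) + (y - h(x))^2 / r"""
    dx = x - xb
    dy = y - obs_operator(x)
    return dx @ B_inv @ dx + r_inv * dy * dy

xb = np.array([1.0, 0.5])       # background state: aerosol, relative humidity
B_inv = np.diag([4.0, 25.0])    # inverse background-error covariance (toy)
y_obs, r_inv = 1.0, 10.0        # observed visibility 1 km; inverse obs error
res = minimize(cost, xb, args=(xb, B_inv, y_obs, r_inv), method="Nelder-Mead")
x_a = res.x                     # analysis state
```

    Because the operator couples visibility to both aerosol and humidity, a single visibility observation pulls on both variables at once, which is how visibility data can produce humidity increments comparable to those from direct humidity observations.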