    Diffraction-limited CCD imaging with faint reference stars

    By selecting short exposure images taken using a CCD with negligible readout noise, we obtained essentially diffraction-limited 810 nm images of faint objects using nearby reference stars brighter than I=16 at a 2.56 m telescope. The FWHM of the isoplanatic patch for the technique is found to be 50 arcseconds, providing ~20% sky coverage around suitable reference stars. Comment: 4-page letter accepted for publication in Astronomy and Astrophysics
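The frame-selection idea behind this "lucky imaging" result can be sketched in miniature: rank many short exposures by a sharpness proxy, keep the best few, and shift-and-add them. Everything below (32x32 frames, a 10% selection fraction, the peak-pixel Strehl proxy) is an illustrative assumption, not the paper's actual pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_frame(shift, blur):
    """Toy 32x32 short exposure: one star, Gaussian-blurred by seeing,
    shifted by atmospheric tip/tilt, and normalised to unit total flux."""
    y, x = np.mgrid[:32, :32]
    f = np.exp(-((y - 16 - shift[0]) ** 2 + (x - 16 - shift[1]) ** 2)
               / (2.0 * blur ** 2))
    return f / f.sum()

# Simulate 100 short exposures with random shifts and variable blur widths.
frames = [make_frame(rng.integers(-3, 4, size=2), rng.uniform(1.0, 4.0))
          for _ in range(100)]

# Selection: rank frames by the peak pixel of the reference star (a crude
# Strehl-ratio proxy) and keep the sharpest 10%.
scores = [f.max() for f in frames]
best = sorted(range(len(frames)), key=lambda i: scores[i], reverse=True)[:10]

# Shift-and-add: recentre each selected frame on its brightest pixel
# and average, giving a sharper combined image than a plain long exposure.
stack = np.zeros((32, 32))
for i in best:
    py, px = np.unravel_index(np.argmax(frames[i]), frames[i].shape)
    stack += np.roll(frames[i], (16 - py, 16 - px), axis=(0, 1))
stack /= len(best)
```

In a real instrument the reference star supplies both the selection score and the recentring shift, which is why the technique needs a sufficiently bright star within the isoplanatic patch.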

    Clean-Up after Territorial Oil Spills in the Arctic

    … The nature of a spill will depend on the local weather conditions, including the presence or absence of snow, the absorptive capacity of the ground (which is influenced by the prevailing groundwater level), and local topography. … In clean-up operations after oil spills, the first requirement is to contain the spill in as small an area as possible, and to prevent it from reaching water courses and thus contaminating their environments. Devices and techniques can then be employed to remove oil from the contaminated region and convey it to temporary storage. Thirdly, the area can be treated to remove residual oil and promote its early restoration through the use of chemical and biological techniques. … In temperate regions, terrestrial spills are most readily contained by artificial dykes or dams constructed by means of earth-moving equipment, and it is also possible to dig trenches and ditches into which oil will flow and be retained. In areas of permafrost, however, suitable damming material may not be readily available, or may be obtained only if considerable areas of permafrost are exposed - that is, at the cost of additional environmental damage. The use of heavy vehicles, even if they are available, will compact the insulating active layer of permafrost and thereby cause eventual melting of the permafrost. Containment should involve a minimum of disturbance of the area, with no removal or compaction of the active layer and exposure of permafrost. A method of containment which may be feasible is to use damming material that can be quickly transported to the site and installed without the aid of machinery. 
For example, corrugated metal sheeting in sections about three feet high by ten feet long (1 m x 3 m approximately), with vertical corrugations, could be driven through the active layer down to the water table or frost level or thawed clay soil, all of which provide a basement to vertical oil penetration, and retained in position by T-bar stakes driven into the ground. … The presence of permafrost ironically brings the substantial benefit of there being little of the infiltration of oil into porous soils, with subsequent ground-water contamination, which constitutes such a severe problem in temperate regions; that is, clean-up operations can be facilitated by the presence of permafrost. Another approach to containment, which was tested briefly during the summer of 1974 on wet tundra on Richards Island in the Mackenzie Delta, is to cut a trench, 30 cm wide, to permafrost level across the path of the flowing oil. The trench successfully intercepts the flow of oil, both on and below the surface, and drains it to a low-level point from where it can be pumped to storage or for disposal. … A dam or trench of the type just described, which would necessarily have to be located on the downslope side of the area of spillage, would interfere with natural drainage, and so it would be necessary to control drainage from the area while oil and water were being separated. The present authors suggest that this control could be effected by the installation of an API (American Petroleum Institute) type of oil-water separator which can be constructed easily from prefabricated metal sheeting, usually about 5 feet deep by 10 feet wide by 30 feet long (1.5 m x 3 m x 9 m approximately). … Another possibility would be the use of a compact plate-type oil-water separator. … A significant further advantage of the general technique just explained is the possibility of controlling, and even accelerating, the flow from the area of spillage. 
… Small-scale laboratory tests have demonstrated that significant proportions of the absorbed oil can be floated out of detritus by gravity alone and without agitation. It is likely that, due to its slow rate of evaporation, the sub-surface oil would maintain a viscosity sufficiently low for it to be floated out by water. It is generally recognized, also, that the toxic constituents of oil are the most volatile and water-soluble. Thus, it is likely that the oil would exhibit toxicity only during the first few months after spillage, and then be permanently absorbed in the vegetation and soil and become immobilized. … As a final stage of restoration of an affected area, it may prove beneficial to fertilize it and promote the growth of oil-degrading microorganisms. Since the albedo of the area will be reduced, and so there will be greater absorption of radiation and increased depth of active layer, it may be desirable to increase the albedo artificially by sprinkling the area with reflective materials. In conclusion, the present authors contend that new techniques must be developed for the clean-up of terrestrial spills in the Arctic, since methods used in temperate regions are inappropriate. …

    Belief propagation algorithm for computing correlation functions in finite-temperature quantum many-body systems on loopy graphs

    Belief propagation -- a powerful heuristic method to solve inference problems involving a large number of random variables -- was recently generalized to quantum theory. Like its classical counterpart, this algorithm is exact on trees when the appropriate independence conditions are met and is expected to provide reliable approximations when operated on loopy graphs. In this paper, we benchmark the performance of loopy quantum belief propagation (QBP) in the context of finite-temperature quantum many-body physics. Our results indicate that QBP provides reliable estimates of the high-temperature correlation function when the typical loop size in the graph is large. As such, it is suitable, e.g., for the study of quantum spin glasses on Bethe lattices and the decoding of sparse quantum error correction codes. Comment: 5 pages, 4 figures
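The classical counterpart referred to above can be shown concretely: sum-product message passing on the smallest loopy graph, a 3-spin cycle. This is a toy stand-in for the paper's quantum algorithm; the coupling J, field h, and graph are invented for illustration. On a tree the fixed point would be exact; on this loop it is only approximate, but for weak coupling the belief tracks the exact marginal closely.

```python
import itertools
import numpy as np

# Spins take values in {-1, +1}; J is a weak pairwise coupling and h a
# local field on spin 0 (all values chosen only for this example).
J, h = 0.2, 0.3
edges = [(0, 1), (1, 2), (2, 0)]
neighbours = {0: [1, 2], 1: [0, 2], 2: [0, 1]}
psi = np.exp(J * np.outer([-1, 1], [-1, 1]))   # pairwise factor psi[s_i, s_j]
phi = {i: np.exp(h * np.array([-1, 1])) if i == 0 else np.ones(2)
       for i in range(3)}                      # unary factor on spin 0 only

# Initialise all six directed messages uniformly and iterate the update
# m_{i->j}(s_j) = sum_{s_i} psi(s_i, s_j) phi_i(s_i) prod_{k!=j} m_{k->i}(s_i).
msgs = {(i, j): np.ones(2) for i in range(3) for j in neighbours[i]}
for _ in range(100):
    new = {}
    for (i, j) in msgs:
        inc = phi[i] * np.prod([msgs[(k, i)] for k in neighbours[i] if k != j],
                               axis=0)
        m = psi.T @ inc            # sum over the states of spin i
        new[(i, j)] = m / m.sum()  # normalise for numerical stability
    msgs = new

# BP belief at spin 0 versus the exact marginal from brute-force enumeration.
belief = phi[0] * np.prod([msgs[(k, 0)] for k in neighbours[0]], axis=0)
belief /= belief.sum()
states = list(itertools.product([-1, 1], repeat=3))
w = [np.exp(h * s[0] + J * sum(s[i] * s[j] for i, j in edges)) for s in states]
exact_up = sum(wi for s, wi in zip(states, w) if s[0] == 1) / sum(w)
```

The abstract's observation that QBP is reliable when loops are large mirrors this classical picture: the shorter the loops, the larger the correlations that the message-passing independence assumption ignores.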

    Optimal and Efficient Decoding of Concatenated Quantum Block Codes

    We consider the problem of optimally decoding a quantum error correction code -- that is, of finding the optimal recovery procedure given the outcomes of partial "check" measurements on the system. In general, this problem is NP-hard. However, we demonstrate that for concatenated block codes, the optimal decoding can be efficiently computed using a message passing algorithm. We compare the performance of the message passing algorithm to that of the widespread blockwise hard decoding technique. Our Monte Carlo results using the 5-qubit and Steane codes on a depolarizing channel demonstrate significant advantages of message passing in two respects. 1) Optimal decoding increases by as much as 94% the error threshold below which the error correction procedure can be used to reliably send information over a noisy channel. 2) For noise levels below these thresholds, the probability of error after optimal decoding is suppressed at a significantly higher rate, leading to a substantial reduction of the error correction overhead. Comment: Published version
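The gap between blockwise hard decoding and optimal decoding is easy to see in a classical analogue (this is not the paper's quantum algorithm): a [3,1] repetition code concatenated with itself yields nine copies of one bit, and we can compute both decoders' error probabilities exactly by enumerating every error pattern on a binary symmetric channel.

```python
from itertools import product

p = 0.2  # illustrative bit-flip probability of the binary symmetric channel

def prob(pattern):
    """Probability of a given 9-bit error pattern on the BSC."""
    flips = sum(pattern)
    return p ** flips * (1 - p) ** (9 - flips)

p_hard = p_opt = 0.0
for e in product([0, 1], repeat=9):
    # Blockwise hard decoding: a majority vote inside each inner triple,
    # then a majority vote over the three inner decisions.
    inner = [sum(e[3 * b: 3 * b + 3]) >= 2 for b in range(3)]
    if sum(inner) >= 2:
        p_hard += prob(e)
    # Optimal decoding: for this code, a single majority vote over all
    # nine received bits (maximum likelihood).
    if sum(e) >= 5:
        p_opt += prob(e)
```

Hard decoding fails on patterns such as two flips in each of two triples (four flips total), which optimal decoding corrects; hence `p_opt` comes out well below `p_hard`. The quantum message passing algorithm of the paper exploits the same concatenated structure to reach the optimal answer efficiently instead of by brute force.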

    Optimized supernova constraints on dark energy evolution

    A model-independent method to study the possible evolution of dark energy is presented. Optimal estimates of the dark energy equation of state w are obtained from current supernova data from Riess et al. (2004) following a principal components approach. We assess the impact of varying the number of piecewise constant estimates of w using a model selection method, the Bayesian information criterion, and compare the most favored models with some parametrizations commonly used in the literature. Although the data seem to prefer a cosmological constant, some models are only moderately disfavored by our selection criterion: a constant w, w linear in the scale factor, w linear in redshift, and the two-parameter models introduced here. Among these, the models we find by optimization are slightly preferred. However, current data do not allow us to draw a conclusion on the possible evolution of dark energy. Interestingly, the best fits for all varying-w models exhibit w<-1 at low redshifts. Comment: 6 pages, 4 figures; typos removed and reference added to match published version
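The model-selection step can be illustrated on toy data: the Bayesian information criterion trades goodness of fit against the number of parameters, penalising each extra parameter by ln(n). The data, the polynomial model family, and all numbers below are invented stand-ins for the paper's piecewise-constant w(z) models.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy data with a genuinely linear trend plus Gaussian noise.
n = 100
x = np.linspace(0.0, 1.0, n)
y = 2.0 * x + rng.normal(0.0, 0.5, n)

def bic(degree):
    """BIC = k ln(n) + n ln(RSS/n) for a least-squares polynomial fit,
    using the Gaussian-likelihood form of the criterion."""
    coeffs = np.polyfit(x, y, degree)
    rss = float(np.sum((y - np.polyval(coeffs, x)) ** 2))
    k = degree + 1  # number of fitted parameters
    return k * np.log(n) + n * np.log(rss / n)

# Compare constant, linear, quadratic, and cubic models.
scores = {d: bic(d) for d in range(4)}
```

The linear model cuts the residual drastically relative to the constant, so its BIC is far lower; the quadratic and cubic fits barely reduce the residual further, so their larger penalties typically leave the linear model preferred, mirroring how the paper selects the number of piecewise-constant bins for w.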

    Generalizing with perceptrons in case of structured phase- and pattern-spaces

    We investigate the influence of different kinds of structure on the learning behaviour of a perceptron performing a classification task defined by a teacher rule. The underlying pattern distribution is permitted to have spatial correlations. The prior distribution for the teacher coupling vectors itself is assumed to be nonuniform. Thus classification tasks of quite different difficulty are included. As learning algorithms we discuss Hebbian learning, Gibbs learning, and Bayesian learning with different priors, using methods from statistics and the replica formalism. We find that the Hebb rule is quite sensitive to the structure of the actual learning problem, failing asymptotically in most cases. In contrast, the behaviour of the more sophisticated Gibbs and Bayes learning methods is influenced by the spatial correlations only in an intermediate regime of α, where α specifies the size of the training set. Concerning the Bayesian case, we show how enhanced prior knowledge improves the performance. Comment: LaTeX, 32 pages with eps-figs, accepted by J Phys
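The teacher-student setup and the Hebb rule are simple enough to sketch directly. Here the patterns are unstructured i.i.d. Gaussians (the baseline case, without the spatial correlations the paper studies), and all sizes are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)
N, P = 100, 500  # input dimension and training-set size (alpha = P/N = 5)

# Teacher rule: labels are the sign of a projection onto a fixed vector.
teacher = rng.normal(size=N)
X = rng.normal(size=(P, N))   # i.i.d. Gaussian patterns (no structure)
y = np.sign(X @ teacher)

# Hebb rule: the student vector is simply the label-weighted sum of the
# training patterns -- no iterative optimisation at all.
w = (y[:, None] * X).sum(axis=0)

# Generalization: agreement with the teacher on fresh patterns.
X_test = rng.normal(size=(1000, N))
accuracy = np.mean(np.sign(X_test @ w) == np.sign(X_test @ teacher))
```

In this unstructured setting the Hebb vector aligns increasingly well with the teacher as α grows; the paper's point is that once the pattern distribution or teacher prior is structured, this one-shot rule can fail asymptotically where Gibbs and Bayes learning do not.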

    Future evolution and uncertainty of river flow regime change in a deglaciating river basin

    The flow regimes of glacier-fed rivers are sensitive to climate change due to strong climate–cryosphere–hydrosphere interactions. Previous modelling studies have projected changes in annual and seasonal flow magnitude but neglect other changes in river flow regime that also have socio-economic and environmental impacts. This study employs a signature-based analysis of climate change impacts on the river flow regime for the deglaciating Virkisá river basin in southern Iceland. Twenty-five metrics (signatures) are derived from 21st century projections of river flow time series to evaluate changes in different characteristics (magnitude, timing and variability) of river flow regime over sub-daily to decadal timescales. The projections are produced by a model chain that links numerical models of climate and glacio-hydrology. Five components of the model chain are perturbed to represent their uncertainty including the emission scenario, numerical climate model, downscaling procedure, snow/ice melt model and runoff-routing model. The results show that the magnitude, timing and variability of glacier-fed river flows over a range of timescales will change in response to climate change. For most signatures there is high confidence in the direction of change, but the magnitude is uncertain. A decomposition of the projection uncertainties using analysis of variance (ANOVA) shows that all five perturbed model chain components contribute to projection uncertainty, but their relative contributions vary across the signatures of river flow. For example, the numerical climate model is the dominant source of uncertainty for projections of high-magnitude, quick-release flows, while the runoff-routing model is most important for signatures related to low-magnitude, slow-release flows. 
The emission scenario dominates mean monthly flow projection uncertainty, but during the transition from the cold to the melt season (April and May) the snow/ice melt model contributes up to 23% of projection uncertainty. Signature-based decompositions of projection uncertainty can be used to better design impact studies and provide more robust projections.
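The ANOVA decomposition used to attribute projection uncertainty can be sketched on a toy factorial ensemble. The 4x3 grid below (four "climate models" crossed with three "melt models") and all effect sizes are invented; a real study would use the full five-component model chain.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical ensemble of one flow signature: a strong climate-model
# effect, a weaker melt-model effect, and a small interaction term.
climate_effect = np.array([0.0, 1.0, 2.0, 3.0])
melt_effect = np.array([0.0, 0.3, 0.6])
y = climate_effect[:, None] + melt_effect[None, :] + rng.normal(0, 0.1, (4, 3))

# Two-way ANOVA: partition the total sum of squares into main effects
# and a residual (interaction) term; the partition is exact.
grand = y.mean()
row = y.mean(axis=1)   # climate-model means
col = y.mean(axis=0)   # melt-model means
ss_climate = y.shape[1] * np.sum((row - grand) ** 2)
ss_melt = y.shape[0] * np.sum((col - grand) ** 2)
ss_resid = np.sum((y - row[:, None] - col[None, :] + grand) ** 2)
ss_total = np.sum((y - grand) ** 2)

fractions = {k: v / ss_total for k, v in
             [("climate", ss_climate), ("melt", ss_melt),
              ("interaction", ss_resid)]}
```

Repeating this decomposition signature by signature is what lets the study say, for instance, that the climate model dominates the uncertainty in quick-release flows while the runoff-routing model dominates slow-release signatures.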