
    12 loops and triple wrapping in ABJM theory from integrability

    Adapting a method recently proposed by C. Marboe and D. Volin for $\mathcal{N}=4$ super-Yang-Mills, we develop an algorithm for a systematic weak-coupling expansion of the spectrum of anomalous dimensions in the $sl(2)$-like sector of planar $\mathcal{N}=6$ super-Chern-Simons theory. The method relies on the Quantum Spectral Curve formulation of the problem, and the expansion is written in terms of the interpolating function $h(\lambda)$, with coefficients expressible as combinations of Euler-Zagier sums with alternating signs. We present explicit results up to 12 loops (six nontrivial orders) for various twist L=1 and L=2 operators, corresponding to triple- and double-wrapping terms, respectively, which are beyond the reach of the Asymptotic Bethe Ansatz as well as Lüscher's corrections. The algorithm works for generic values of L and S and in principle can be used to compute arbitrary orders of the weak-coupling expansion. For the simplest operator, with L=1 and spin S=1, the Padé extrapolation of the 12-loop result agrees nicely with the available Thermodynamic Bethe Ansatz data over a relatively wide range of values of the coupling. A Mathematica notebook with a selection of results is attached. Comment: 31 pages, 1 figure. A Mathematica notebook with a selection of results is attached (please download the compressed file "Results.nb" listed under "Other formats"). v2: typos corrected; more precise checks of the results; an earlier incorrect version of the figure was replaced. Published in JHEP.
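    The Padé extrapolation mentioned above resums a truncated weak-coupling series into a rational function. As a minimal sketch (with toy coefficients, not the paper's actual 12-loop series), the standard [m/n] approximant can be built from the Taylor coefficients by solving a small linear system:

    ```python
    import numpy as np

    def pade(coeffs, m, n):
        """[m/n] Pade approximant P/Q from Taylor coefficients c_0..c_{m+n}
        of sum_k c_k x^k; returns ascending coefficients (p, q), q[0] = 1."""
        c = np.asarray(coeffs, dtype=float)
        assert len(c) >= m + n + 1
        # Denominator q_1..q_n from: sum_{j=1}^{n} q_j c_{m+i-j} = -c_{m+i}
        A = np.array([[c[m + i - j] if m + i - j >= 0 else 0.0
                       for j in range(1, n + 1)] for i in range(1, n + 1)])
        q = np.concatenate(([1.0], np.linalg.solve(A, -c[m + 1:m + n + 1])))
        # Numerator: p_k = sum_{j=0}^{min(k,n)} q_j c_{k-j}
        p = np.array([sum(q[j] * c[k - j] for j in range(min(k, n) + 1))
                      for k in range(m + 1)])
        return p, q

    # Toy series 1 - x + x^2 - ... : the [1/1] approximant resums it to 1/(1+x)
    p, q = pade([1.0, -1.0, 1.0], 1, 1)
    x = 0.7
    approx = np.polyval(p[::-1], x) / np.polyval(q[::-1], x)
    ```

    In the same spirit, the paper's Padé approximant of the 12-loop series remains usable at couplings where the truncated polynomial itself diverges.
    
    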

    Particle sorting by a structured microfluidic ratchet device with tunable selectivity: Theory and Experiment

    We theoretically predict and experimentally demonstrate that several different particle species can be separated from one another by means of a ratchet device consisting of periodically arranged triangular (ratchet-shaped) obstacles. We propose an explicit algorithm for suitably tailoring the externally applied, time-dependent voltage protocol so that one or several arbitrarily selected particle species are forced to migrate opposite to all the remaining species. As an example, we present numerical simulations for a mixture of five species, labelled according to increasing size, in which species 2 and 4 simultaneously move in one direction and species 1, 3, and 5 in the other. The selection of species to be separated from the others can be changed at any time simply by adapting the voltage protocol. This general theoretical concept of using one device for many different sorting tasks is experimentally confirmed for a mixture of three colloidal particle species.
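    The core idea of tailoring the protocol can be illustrated with a toy linear model (the drift velocities below are invented for illustration, not taken from the paper): if each species s drifts with a known velocity v[s, k] during protocol phase k, choosing the phase durations t_k fixes the net displacement V @ t, so durations can be solved for that send selected species in opposite directions:

    ```python
    import numpy as np

    # Hypothetical drift velocities v[s, k] of species s during phase k of the
    # periodic voltage protocol (arbitrary units); in the actual device these
    # would follow from the measured ratchet response of each species.
    V = np.array([[1.0, -0.5],    # species 1
                  [0.3, -0.9]])   # species 2
    target = np.array([1.0, -1.0])  # net displacement per period: 1 right, 2 left
    t = np.linalg.solve(V, target)  # phase durations realizing the target
    assert np.all(t > 0)            # durations must be physical (positive)
    net = V @ t                     # resulting net displacements
    ```

    With more species and phases, the same idea becomes a linear feasibility problem for the durations, which is one way to read the paper's "explicit algorithm" for protocol design.
    
    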

    Rossberg landslide history and flood chronology as recorded in Lake Lauerz sediments (Central Switzerland)

    The southern slopes of Rossberg mountain, Central Switzerland, on which one of the largest historic landslides of the Alpine region was released in 1806 AD (Goldauer Bergsturz), are prone to large-scale mass-wasting processes. This has led to numerous sliding events, which are well recognizable in the modern topography but lack accurate dating. In order to provide new insights into the timing of and processes associated with past landslides, as well as into the frequency of exceptional flood events, long sediment cores were retrieved from the subsurface of Lake Lauerz, which lies in the pathway of these landslides and records strong runoff events as typical flood layers. Analyses of the recovered cores reveal a sedimentologic succession with variable fingerprints of past landslides and flood events, depending on the coring location within the lake. The landslide signature can be calibrated using the 1806 AD event: an organic-rich peaty unit, found in two cores located close to the rock-mass impact, points towards a sudden, gravity-spreading-induced lateral displacement of the swampy plain on which parts of the rock mass accumulated. This rapid lateral mobilization of soft sediments, and not the rock masses themselves, acted as the ultimate trigger for the reported ~15 m-high impulse waves on the lake. In the more distal areas, the 1806 AD event led to the deposition of a thick, organic-rich redeposited layer. The 10 m-long core from the distal basin covers a radiocarbon-dated ~2,000-year sedimentation history and contains a highly similar event layer that was deposited in 810±60 AD. This layer is most likely the product of a major historic landslide, known as the Röthener Bergsturz, which, based on scarce historical reports, was commonly dated to 1222 AD. In the 2,000-year record, we identify three periods with enhanced occurrence of flood turbidites, dated to 580-850 AD, 990-1420 AD, and 1630-1940 AD. Among the 54 detected flood layers, 6 probably mark exceptionally heavy rainfall events, dated to ~610, ~1160, ~1290, ~1660, ~1850, and ~1876 AD, the latter being associated with one of the most intense rainfall events ever recorded instrumentally in the region.

    Sedimentological and stratigraphic framework of the several hundred thousand years old lacustrine record from Lake Van, Turkey

    Within the frame of the International Continental scientific Drilling Program (ICDP) project PALEOVAN, a long and continuous sediment record was drilled in summer 2010 from Lake Van, a closed lake situated in a climatically sensitive, semiarid, and tectonically active region in Eastern Anatolia. At two sites, Ahlat Ridge and Northern Basin, sedimentary records of 220 and 140 m were recovered, respectively. With basal ages possibly around 500,000 years, these records span several glacial-interglacial cycles and reach back to the lake's initial transgression in the Middle Pleistocene. First results from ongoing analysis of core-catcher samples and newly opened cores document the sedimentological and geochemical succession. Two composite profiles of the drill sites were defined. Core-catcher-based geochemical data, such as proxies of the lake's productivity and of catchment alterations, show large variations and reflect a rich paleoenvironmental history. Most of the 220 m-thick succession consists of carbonate mud, mostly laminated at sub-mm scale and interbedded with either homogeneous mud or cm-thick pyroclastic layers. The lowermost sediments from the Ahlat Ridge site represent the initial lake transgression, as the drilling could not penetrate further and the seismic data indicate coincidence with the 'acoustic' basement. Such an early transgressive state of the lake's history is also supported by the lithology, which consists of a gravel unit indicating a beach-like environment, overlain by sand deposits containing fresh-water gastropods (Bithynia). Above 200 mblf, the laminated mud clearly indicates that the lake was already deep enough to form anoxic bottom water, as the laminations were preserved. This unique paleoclimate archive records major changes in depositional conditions that hint at a fascinating environmental evolution, and it offers ideal prerequisites for investigating the Quaternary climate evolution in the Near East.

    3D manufacturing tolerancing with probing of a local work coordinate system

    The safety and performance requirements for mechanisms are such that the necessary accuracy of part geometry is difficult to reach using classical manufacturing processes. This paper proposes a manufacturing tolerance stack-up method based on the analysis-line method. The technique enables both the analysis and the synthesis of ISO manufacturing specifications through a new approach that relies on production specifications, adjustment specifications, and their analysis to stack up the 3D resultant. The originality of the method resides in the 3D calculation for location requirements, which takes into account angular effects and probing operations on numerically controlled machine tools in order to define a local Work Coordinate System (WCS). For tolerance analysis, deviations are modelled using small-displacement torsors. This tolerance analysis method enables one to determine explicit three-dimensional linear relations between manufacturing tolerances and functional requirements; these relations can be used as constraints for tolerance optimization.
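    The small-displacement torsor model underlying the stack-up is linear: a torsor (translation d, rotation ω, expressed at some origin) displaces an analysis point M by d + ω × OM, and contributions from successive setups add. A minimal sketch with invented torsor components (not values from the paper) shows why the resultant along an analysis direction is linear in the tolerance parameters:

    ```python
    import numpy as np

    def deviation_at(point, d, omega, origin=np.zeros(3)):
        """Displacement at `point` induced by a small-displacement torsor with
        translation d and small rotation omega expressed at `origin`."""
        return d + np.cross(omega, point - origin)

    M = np.array([100.0, 20.0, 0.0])   # analysis point (mm), hypothetical
    n = np.array([0.0, 0.0, 1.0])      # analysis direction (surface normal)
    torsors = [                         # one torsor per machining setup
        (np.array([0.0, 0.0, 0.02]),  np.array([1e-4, 0.0, 0.0])),
        (np.array([0.0, 0.0, -0.01]), np.array([0.0, 2e-4, 0.0])),
    ]
    # Resultant deviation along n: a linear combination of all torsor components
    resultant = sum(float(n @ deviation_at(M, d, w)) for d, w in torsors)
    ```

    Because each term is linear in (d, ω), bounding the resultant over the tolerance intervals yields exactly the kind of explicit linear constraints between manufacturing tolerances and the functional requirement that the paper exploits for optimization.
    
    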

    Quantum number preserving ansätze and error mitigation studies for the variational quantum eigensolver

    Computational chemistry has advanced rapidly in the last decade on the back of increased performance of CPU- and GPU-based computation. The in silico prediction of reaction properties of varying chemical compounds promises to speed up development of, e.g., new catalytic processes that reduce the energy demand of known industrial reactions. Theoretical chemistry has found ways to approximate the complexity of the underlying intractable quantum many-body problem to various degrees, achieving chemically accurate ab initio calculations for various experimentally verified systems. Still, limited in theory by fundamental complexity theorems, accurate and reliable predictions for large and/or highly correlated systems elude computational chemists today. As solving the Schrödinger equation is one of the main use cases of quantum computation, as originally envisioned by Feynman himself, computational chemistry has emerged as one of the industrial applications of quantum computers, originally motivated by potential exponential improvements of quantum phase estimation over its classical counterparts. As of today, most rigorous speed-ups found in quantum algorithms apply only to so-called error-corrected quantum computers, in which local qubit decoherence does not limit the length of the algorithms that can be run. Over the last decade, the size of available quantum computing hardware has steadily increased, and first proofs of concept of error-correction codes have been achieved in the last year, reducing error rates below the individual error rates of the qubits comprising the code. Still, fully error-corrected quantum computers of a size that overcomes the constant-factor separation between classical and quantum algorithms with increasing system size are a decade or more away.
    Meanwhile, considerable efforts have been made to find potential quantum speed-ups of non-error-corrected quantum systems for various applications in the noisy intermediate-scale quantum (NISQ) era. In chemistry, the variational quantum eigensolver (VQE), a family of classical-quantum hybrid algorithms, has become a topic of interest as a way of potentially solving computational chemistry problems on current quantum hardware. The main contributions of this work are: extending the VQE framework with two new potential ansätze, namely (1) a maximally dense first-order Trotterized ansatz for the paired approximation of the electronic-structure Hamiltonian and (2) a gate fabric with many favourable properties, such as conservation of the relevant quantum numbers, locality of the individual operations, and initialisation strategies that mitigate plateaus of vanishing gradients during optimisation; (3) contributions to one of the largest and most complex VQE experiments to date, including the aforementioned ansatz in the paired approximation, benchmarking of different error-mitigation techniques to achieve accurate results, and performance extrapolations giving perspective on what is needed for NISQ devices to compete with classical algorithms; (4) simulations to find optimal ways of measuring Hamiltonians in this error-mitigated framework; and (5) a simulation of different purification-based error-mitigation techniques and their combinations under different noise models, together with a way of efficiently calibrating one of them for coherent noise. We discuss the state of the VQE almost a decade after its introduction and give an outlook on computational chemistry on quantum computers in the near future.
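    One widely used error-mitigation technique of the kind benchmarked in such VQE experiments is zero-noise extrapolation: the noise in the circuit is deliberately amplified by a controllable factor, and the expectation value is extrapolated back to the zero-noise limit. A minimal sketch with a synthetic exponential noise model (the decay rate and the "exact" energy below are invented for illustration):

    ```python
    import numpy as np

    def noisy_expectation(lam, exact=-1.137, rate=0.08):
        """Synthetic noisy estimate of an observable: the signal decays
        exponentially with the noise-scaling factor lam (a common model)."""
        return exact * np.exp(-rate * lam)

    lams = np.array([1.0, 2.0, 3.0])   # amplified noise levels (lam=1: native)
    vals = noisy_expectation(lams)     # measured expectation values
    # Under the exponential model, log|E(lam)| is linear in lam:
    slope, intercept = np.polyfit(lams, np.log(-vals), 1)
    mitigated = -np.exp(intercept)     # extrapolate to lam = 0
    ```

    Purification-based techniques, as studied in the thesis, work differently (by suppressing the mixed-state component of the prepared state), but the extrapolation sketch conveys the general pattern: trade extra circuit executions for a bias-reduced estimate.
    
    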