OH and HO2 chemistry in the North Atlantic free troposphere
Interactions between atmospheric hydrogen oxides and aircraft nitrogen oxides determine the impact of aircraft exhaust on atmospheric chemistry. To study these interactions, the Subsonic Assessment: Ozone and Nitrogen Oxide Experiment (SONEX) assembled the most complete measurement complement to date for studying HO(x) (OH and HO2) chemistry in the free troposphere. Observed and modeled HO(x) agree on average to within experimental uncertainties (±40%). However, significant discrepancies occur as a function of NO and at solar zenith angles >70°. Some discrepancies appear to be removed by model adjustments to HO(x)-NO(x) chemistry, particularly by reducing HO2NO2 (PNA) and by including heterogeneous reactions on aerosols and cirrus clouds.
Computational neuroanatomy: ontology-based representation of neural components and connectivity
Background: A critical challenge in neuroscience is organizing, managing, and accessing the explosion in neuroscientific knowledge, particularly anatomic knowledge. We believe that explicit knowledge-based approaches to make neuroscientific knowledge computationally accessible will be helpful in tackling this challenge and will enable a variety of applications exploiting this knowledge, such as surgical planning. Results: We developed ontology-based models of neuroanatomy to enable symbolic lookup, logical inference and mathematical modeling of neural systems. We built a prototype model of the motor system that integrates descriptive anatomic and qualitative functional neuroanatomical knowledge. In addition to modeling normal neuroanatomy, our approach provides an explicit representation of abnormal neural connectivity in disease states, such as common movement disorders. The ontology-based representation encodes both structural and functional aspects of neuroanatomy, and the resulting models can be evaluated computationally, enabling the development of automated computer reasoning applications. Conclusion: Neuroanatomical knowledge can be represented in machine-accessible format using ontologies. Computational neuroanatomical approaches such as those described in this work could become a key tool in translational informatics, leading to decision support applications that inform and guide surgical planning and personalized care for neurological disease in the future.
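As a rough illustration of the idea (not the authors' actual ontology or tooling), connectivity knowledge can be encoded as a directed graph over named structures, with multi-synaptic pathways inferred by transitive reachability; the structure names below are simplified placeholders.

```python
# Minimal sketch of machine-accessible connectivity knowledge: structures are
# nodes, projections are directed edges, and reasoning is graph traversal.
from collections import defaultdict

class NeuroOntology:
    def __init__(self):
        # structure -> set of structures it projects to directly
        self.projections = defaultdict(set)

    def add_projection(self, source, target):
        self.projections[source].add(target)

    def reachable(self, source):
        """All structures reachable from `source` via one or more projections."""
        seen, stack = set(), [source]
        while stack:
            node = stack.pop()
            for nxt in self.projections[node]:
                if nxt not in seen:
                    seen.add(nxt)
                    stack.append(nxt)
        return seen

# Simplified fragment of a motor pathway (placeholder names, not the paper's model)
onto = NeuroOntology()
onto.add_projection("primary motor cortex", "internal capsule")
onto.add_projection("internal capsule", "spinal cord")
print(onto.reachable("primary motor cortex"))
```

A real ontology adds typed relations (part-of, is-a) and functional annotations, but the same principle applies: once the knowledge is explicit and symbolic, inference is mechanical.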
Solving discrete logarithms on a 170-bit MNT curve by pairing reduction
Pairing-based cryptography is in a dangerous position following the breakthroughs on discrete logarithm computations in finite fields of small characteristic. Remaining instances are built over finite fields of large characteristic, and their security relies on the fact that the embedding field of the underlying curve is relatively large. How large is debatable. The aim of our work is to sustain the claim that the combination of a degree-3 embedding and a too-small finite field obviously does not provide enough security. As a computational example, we solve the DLP on a 170-bit MNT curve by exploiting the pairing embedding to a 508-bit, degree-3 extension of the base field. Comment: to appear in Lecture Notes in Computer Science (LNCS).
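The embedding field mentioned above is determined by the embedding degree: the smallest k such that the order r of the curve subgroup divides p^k - 1. A minimal sketch with toy parameters (not the actual 170-bit curve):

```python
# The embedding degree of a subgroup of prime order r over F_p is the smallest
# k with r | p^k - 1, i.e. the multiplicative order of p modulo r.
def embedding_degree(p, r, max_k=50):
    pk = 1
    for k in range(1, max_k + 1):
        pk = (pk * p) % r
        if pk == 1:
            return k
    return None  # order exceeds max_k

# Toy example: subgroup order r = 13 over F_29 gives a degree-3 embedding,
# since 29 ≡ 3 (mod 13) and 3^3 = 27 ≡ 1 (mod 13).
print(embedding_degree(29, 13))  # → 3
```

A degree-3 embedding moves the curve DLP into the cubic extension of the base field, which for a roughly 170-bit base field is the roughly 508-bit field attacked above.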
On the Bit Security of Elliptic Curve Diffie–Hellman
This paper gives the first bit security result for the elliptic curve Diffie–Hellman key exchange protocol for elliptic curves defined over prime fields. A fraction of the most significant bits of the x-coordinate of the Diffie–Hellman key are as hard to compute as the entire key. A similar result can be derived for the lower bits. The paper also generalizes and improves the result for elliptic curves over extension fields, showing that computing one component (in the ground field) of the Diffie–Hellman key is as hard as computing the entire key.
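The role of the x-coordinate as the shared secret can be seen in a toy Diffie–Hellman exchange. The sketch below uses a tiny textbook curve (far too small to be secure, and not related to the paper's parameters) purely to illustrate which value the bit-security result concerns.

```python
# Toy ECDH over the curve y^2 = x^3 + 2x + 2 (mod 17), generator G = (5, 1)
# of order 19. Both parties derive the same point; the x-coordinate is the key.
p, a = 17, 2
G = (5, 1)

def ec_add(P, Q):
    """Affine point addition; None represents the point at infinity."""
    if P is None: return Q
    if Q is None: return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % p == 0:
        return None
    if P == Q:
        lam = (3 * x1 * x1 + a) * pow(2 * y1, -1, p) % p
    else:
        lam = (y2 - y1) * pow(x2 - x1, -1, p) % p
    x3 = (lam * lam - x1 - x2) % p
    return (x3, (lam * (x1 - x3) - y1) % p)

def scalar_mult(k, P):
    """Double-and-add scalar multiplication."""
    R = None
    while k:
        if k & 1:
            R = ec_add(R, P)
        P = ec_add(P, P)
        k >>= 1
    return R

alice_priv, bob_priv = 3, 7
alice_pub = scalar_mult(alice_priv, G)
bob_pub = scalar_mult(bob_priv, G)
shared_a = scalar_mult(alice_priv, bob_pub)
shared_b = scalar_mult(bob_priv, alice_pub)
assert shared_a == shared_b
print(shared_a[0])  # the shared x-coordinate → 6
```

The bit-security question is then: given the public points, is predicting some of the most significant bits of that x-coordinate any easier than computing it outright?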
A Family of Lightweight Twisted Edwards Curves for the Internet of Things
We introduce a set of four twisted Edwards curves that satisfy common security requirements and allow for fast implementations of scalar multiplication on 8, 16, and 32-bit processors. Our curves are defined by an equation of the form -x^2 + y^2 = 1 + dx^2y^2 over a prime field Fp, where d is a small non-square modulo p. The underlying prime fields are based on "pseudo-Mersenne" primes given by p = 2^k - c and have in common that p is congruent to 5 modulo 8, k is one less than a multiple of 32, and c is at most eight bits long. Due to these common features, our primes facilitate a parameterized implementation of the low-level arithmetic so that one and the same arithmetic function is able to process operands of different length. Each of the twisted Edwards curves we introduce in this paper is birationally equivalent to a Montgomery curve of the form -(A+2)y^2 = x^3 + Ax^2 + x where 4/(A+2) is small. Even though this contrasts with the usual practice of choosing A such that (A+2)/4 is small, we show that the Montgomery form of our curves allows for an equally efficient implementation of point doubling as Curve25519. The four curves we put forward roughly match the common security levels of 80, 96, 112 and 128 bits. In addition, their Weierstraß representations are isomorphic to curves of the form y^2 = x^3 - 3x + b so as to facilitate interoperability with TinyECC and other legacy software.
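The stated prime shape can be checked mechanically, and it is what makes the fast reduction possible. The sketch below is illustrative only: it uses p = 2^255 - 19 (the Curve25519 prime, which happens to satisfy the same shape conditions) rather than reproducing the paper's own four primes, and the reduction routine is a generic pseudo-Mersenne fold, not the paper's parameterized implementation.

```python
# Check the family's stated conditions on a pseudo-Mersenne prime p = 2^k - c.
def fits_family_shape(k, c):
    p = 2**k - c
    return (
        p % 8 == 5             # p ≡ 5 (mod 8)
        and (k + 1) % 32 == 0  # k is one less than a multiple of 32
        and c < 256            # c fits in eight bits
    )

# Generic reduction modulo p = 2^k - c: since 2^k ≡ c (mod p), the high part
# of an operand can be folded down as hi * c + lo.
def reduce_pm(x, k, c):
    p = 2**k - c
    while x >= p:
        hi, lo = x >> k, x & ((1 << k) - 1)
        x = hi * c + lo if hi else x - p
    return x

print(fits_family_shape(255, 19))  # True: 2^255 - 19 has the required shape
```

Because every prime in the family shares this shape, a single reduction routine parameterized by (k, c) serves all four curves, which is the point of the "one and the same arithmetic function" remark above.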
Computing Individual Discrete Logarithms Faster in GF(p^n) with the NFS-DL Algorithm
The Number Field Sieve (NFS) algorithm is the best known method to compute discrete logarithms (DL) in finite fields GF(p^n), with p medium to large and n small. This algorithm comprises four steps: polynomial selection, relation collection, linear algebra and, finally, individual logarithm computation. The first step outputs two polynomials defining two number fields, and a map from the polynomial ring over the integers modulo each of these polynomials to GF(p^n). After the relation collection and linear algebra phases, the (virtual) logarithm of a subset of elements in each number field is known. Given the target element in GF(p^n), the fourth step computes a preimage in one number field. If one can write the target preimage as a product of elements of known (virtual) logarithm, then one can deduce the discrete logarithm of the target. As recently shown by the Logjam attack, this final step can be critical when it needs to be computed very quickly. But we realized that computing an individual DL is much slower in medium- and large-characteristic non-prime fields GF(p^n), compared to prime fields GF(p) and quadratic fields GF(p^2). We optimize the first part of individual DL, the "booting step", by dramatically reducing the size of the preimage norm. Its smoothness probability is higher, hence the running time of the booting step is much improved. Our method is very efficient for small extension fields and applies to any n, in medium and large characteristic.
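The benefit of a smaller preimage norm comes from smoothness: a smaller integer is much more likely to factor entirely over a fixed set of small primes. A minimal trial-division smoothness test (illustrative only, not the implementation used in NFS, which relies on sieving and ECM):

```python
# B-smoothness test by trial division: n is B-smooth iff all of its prime
# factors are at most B. Shrinking the norm raises the chance this succeeds.
def is_smooth(n, bound):
    n = abs(n)
    d = 2
    while d <= bound and n > 1:
        while n % d == 0:
            n //= d
        d += 1
    return n == 1

print(is_smooth(2**4 * 3**5 * 7, 10))  # True: every prime factor is <= 10
print(is_smooth(2**4 * 101, 10))       # False: the factor 101 exceeds 10
```

In the booting step, a smooth preimage is exactly what lets the target's logarithm be expressed as a sum of known (virtual) logarithms, so improving the smoothness probability directly shortens this final phase.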
Gravitational lensing analysis of the Kilo-Degree Survey
The Kilo-Degree Survey (KiDS) is a multi-band imaging survey designed for cosmological studies from weak lensing and photometric redshifts. It uses the European Southern Observatory VLT Survey Telescope with its wide-field camera OmegaCAM. KiDS images are taken in four filters similar to the Sloan Digital Sky Survey ugri bands. The best seeing time is reserved for deep r-band observations. The median 5σ limiting AB magnitude is 24.9 and the median seeing is below 0.7 arcsec. Initial KiDS observations have concentrated on the Galaxy and Mass Assembly (GAMA) regions near the celestial equator, where extensive, highly complete redshift catalogues are available. A total of 109 survey tiles, 1 square degree each, form the basis of the first set of lensing analyses of halo properties of GAMA galaxies. Nine galaxies per square arcminute enter the lensing analysis, for an effective inverse shear variance of 69 arcmin^-2. Accounting for the shape measurement weight, the median redshift of the sources is 0.53. KiDS data processing follows two parallel tracks, one optimized for weak lensing measurement and one for accurate matched-aperture photometry (for photometric redshifts). This technical paper describes the lensing and photometric redshift measurements (including a detailed description of the Gaussian aperture and photometry pipeline), summarizes the data quality and presents extensive tests for systematic errors that might affect the lensing analyses. We also provide first demonstrations of the suitability of the data for cosmological measurements, and describe our blinding procedure for preventing confirmation bias in the scientific analyses. The KiDS catalogues presented in this paper are released to the community through http://kids.strw.leidenuniv.nl