On the maximal weight of -ary chain partitions with bounded parts
A -ary chain is a special type of chain partition of integers with parts of the form for some fixed integers and . In this note, we are interested in the maximal weight of such partitions when their parts are distinct and cannot exceed a given bound . Characterizing the cases where the greedy choice fails, we prove that this maximal weight is, as a function of , asymptotically independent of , and we provide an efficient algorithm to compute it.
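The precise part shape and chain condition are elided in this abstract; as a purely illustrative sketch, suppose parts are 3-smooth numbers (of the form 2^a·3^b), each part divides the previous one, and the greedy strategy starts from the largest allowed part:

```python
def largest_3smooth(bound):
    """Largest number of the form 2^a * 3^b not exceeding `bound`."""
    best, p2 = 1, 1
    while p2 <= bound:
        p23 = p2
        while p23 <= bound:
            best = max(best, p23)
            p23 *= 3
        p2 *= 2
    return best

def greedy_chain_weight(bound):
    """Weight of the chain obtained by starting from the largest allowed part
    and greedily removing a factor of 2 (preferred) or 3 at each step; parts
    are automatically distinct since the chain is strictly decreasing."""
    part = largest_3smooth(bound)
    total = 0
    while True:
        total += part
        if part % 2 == 0:
            part //= 2
        elif part % 3 == 0:
            part //= 3
        else:
            break
    return total
```

For `bound = 12` this greedy chain is 12, 6, 3, 1 with weight 22; the point of the paper is precisely to characterize when such a greedy choice is not optimal.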
Radix-10 BKM Algorithm for Computing Transcendentals on Pocket Computers
We present a radix-10 variant of the BKM algorithm. It is a shift-and-add, CORDIC-like algorithm that allows fast computation of complex exponentials and logarithms. This radix-10 version is suitable for implementation in a pocket computer.
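To illustrate the shift-and-add principle behind BKM-style algorithms, here is a minimal radix-2 logarithm by multiplicative normalization (the radix-10 variant of the paper would instead tabulate ln(1 + d·10^-k); this sketch is not the paper's algorithm itself):

```python
import math

# The only constants the scheme needs: a small table of ln(1 + 2^-k).
TABLE = [math.log(1 + 2.0**-k) for k in range(1, 54)]

def shift_add_ln(x):
    """Compute ln(x) for x in [1, 2) by multiplicative normalization:
    greedily multiply p by factors (1 + 2^-k) while staying below x.
    In hardware each such multiplication is a shift and an add, and
    ln(x) accumulates as the sum of the tabulated factor logarithms."""
    assert 1.0 <= x < 2.0
    p, r = 1.0, 0.0
    for k in range(1, 54):
        f = 1 + 2.0**-k
        if p * f <= x:
            p *= f
            r += TABLE[k - 1]
    return r
```

The greedy choice converges because the logarithms of the remaining factors always sum to more than the current one, so any residual can still be absorbed.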
An Alternative Approach for SIDH Arithmetic
In this paper, we present new algorithms for the field arithmetic of Supersingular Isogeny Diffie-Hellman (SIDH), one of the fifteen remaining candidates in the NIST post-quantum standardization process. Our approach uses a polynomial representation of the field elements, together with mechanisms to keep the coefficients within bounds during the arithmetic operations. We present timings and comparisons for SIKEp503, and suggest a novel 736-bit prime that offers a speedup compared to SIKEp751 for a similar level of security.
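The core idea of a polynomial (limb) representation can be sketched as follows; this toy version uses Python's unbounded integers, so the coefficient-bound mechanisms that are the paper's actual contribution are not needed here and are only hinted at in the comments:

```python
W = 16  # limb width in bits (illustrative choice)

def to_limbs(n, k):
    """Split n into k coefficients of the polynomial it represents at x = 2^W."""
    return [(n >> (W * i)) & ((1 << W) - 1) for i in range(k)]

def poly_mul(a, b):
    """Schoolbook polynomial product. The coefficients of c may exceed 2^W;
    in a fixed-width implementation this is exactly where carry handling /
    coefficient-bound mechanisms must intervene."""
    c = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            c[i + j] += ai * bj
    return c

def from_limbs(c):
    """Evaluate the coefficient vector back at x = 2^W."""
    return sum(ci << (W * i) for i, ci in enumerate(c))
```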
Diophantine Approximation, Ostrowski Numeration and the Double-Base Number System
Faster cofactorization with ECM using mixed representations
This paper introduces a novel implementation of the elliptic curve factoring method (ECM), specifically designed for medium-size integers such as those arising by the billions in the cofactorization step of the number field sieve. In this context, our algorithm requires fewer modular multiplications than any other publicly available implementation. The main ingredients are: the use of batches of primes, fast point tripling, optimal double-base decompositions and Lucas chains, and a good mix of Edwards and Montgomery representations.
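A double-base decomposition expresses a scalar as a short sum of {2,3}-integers (terms 2^a·3^b), which can then be evaluated with cheap doublings and triplings. A minimal greedy sketch (the paper uses optimal decompositions, not this greedy one):

```python
def greedy_double_base(n):
    """Greedy double-base decomposition: repeatedly subtract the largest
    3-smooth number (2^a * 3^b) not exceeding the remainder."""
    terms = []
    while n > 0:
        best, p2 = 1, 1
        while p2 <= n:
            p23 = p2
            while p23 <= n:
                best = max(best, p23)
                p23 *= 3
            p2 *= 2
        terms.append(best)
        n -= best
    return terms
```

Such decompositions are typically much shorter than the binary expansion, which is what reduces the number of group operations in scalar multiplication.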
Hybrid Binary-Ternary Joint Sparse Form and its Application in Elliptic Curve Cryptography
Multi-exponentiation is a common and time-consuming operation in public-key cryptography. Its elliptic curve counterpart, called multi-scalar multiplication, is extensively used for digital signature verification. Several algorithms have been proposed to speed up these critical computations. They are based on simultaneously recoding a set of integers in order to minimize the number of general multiplications or point additions. When signed-digit recoding techniques can be used, as in the world of elliptic curves, the Joint Sparse Form (JSF) and interleaving -NAF are the most efficient algorithms. In this paper, a novel recoding algorithm for a pair of integers is proposed, based on a decomposition that mixes powers of 2 and powers of 3. The so-called Hybrid Binary-Ternary Joint Sparse Form requires fewer digits and is sparser than the JSF and the interleaving -NAF. Its advantages are illustrated for elliptic curve double-scalar multiplication; the operation counts show a gain of up to 18%.
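A single-integer analogue gives the flavor of a hybrid binary-ternary signed-digit recoding (the paper's algorithm jointly recodes a *pair* of integers; this simplified sketch handles one):

```python
def hybrid_recode(n):
    """Recode n > 0 in a mixed base-{2,3} positional system with digits
    {0, 1, -1}: take a zero digit whenever n is divisible by 3 or 2,
    otherwise pick d = +/-1 so that (n - d) becomes divisible by 6.
    Returns (base, digit) pairs, least significant first."""
    digits = []
    while n > 0:
        if n % 3 == 0:
            digits.append((3, 0)); n //= 3
        elif n % 2 == 0:
            digits.append((2, 0)); n //= 2
        else:
            d = 1 if n % 6 == 1 else -1  # n is odd and not divisible by 3
            digits.append((2, d)); n = (n - d) // 2
    return digits

def recode_value(digits):
    """Evaluate the recoding most-significant-digit first (Horner scheme);
    on a curve this is exactly the double/triple-and-add pattern."""
    acc = 0
    for base, d in reversed(digits):
        acc = acc * base + d
    return acc
```

The zero digits are free (just a doubling or tripling), so sparser recodings directly translate into fewer point additions.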
Randomized Mixed-Radix Scalar Multiplication
A covering system of congruences can be defined as a set of congruence relations of the form for satisfying the property that for every integer in , there exists at least one index such that . First, we show that most existing scalar multiplication algorithms can be formulated in terms of covering systems of congruences. Then, using a special form of covering systems called exact -covers, we present a novel uniformly randomized scalar multiplication algorithm with built-in protections against various types of side-channel attacks. This algorithm can be an alternative to Coron's scalar blinding technique for elliptic curves, in particular when the choice of a particular finite field tailored for speed compels the use of a large random factor.
Randomizing scalar multiplication using exact covering systems of congruences
A covering system of congruences can be defined as a set of congruence relations of the form for satisfying the property that for every integer in , there exists at least one index such that . First, we show that most existing scalar multiplication algorithms can be formulated in terms of covering systems of congruences. Then, using a special form of covering systems called exact -covers, we present a novel uniformly randomized scalar multiplication algorithm that may be used to counter differential side-channel attacks, and more generally physical attacks that require multiple executions of the algorithm. This algorithm can be an alternative to Coron's scalar blinding technique for elliptic curves, in particular when the choice of a particular finite field tailored for speed compels the use of a large random factor.
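An *exact* cover is a covering system in which every integer satisfies exactly one of the congruences; this can be verified by checking all residues modulo the lcm of the moduli. A small sketch (an exact n-cover would generalize the check to exactly n hits):

```python
from math import lcm
from functools import reduce

def is_exact_cover(congruences):
    """Check that congruences, given as (r_i, m_i) pairs meaning
    x = r_i (mod m_i), cover every integer exactly once. It suffices
    to test one full period, i.e. all residues modulo lcm of the m_i."""
    M = reduce(lcm, (m for _, m in congruences))
    for x in range(M):
        hits = sum(1 for r, m in congruences if x % m == r % m)
        if hits != 1:
            return False
    return True
```

For instance, {x ≡ 0 (mod 2), x ≡ 1 (mod 4), x ≡ 3 (mod 4)} partitions the integers, while {x ≡ 0 (mod 2), x ≡ 1 (mod 3)} misses some residues.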
Improving Goldschmidt Division, Square Root and Square Root Reciprocal
The aim of this paper is to accelerate division, square root and square root reciprocal computations when the Goldschmidt method is used on a pipelined multiplier. This is done by replacing the last iteration by the addition of a correcting term that can be looked up during the early iterations. We describe several variants of the Goldschmidt algorithm assuming a 4-cycle pipelined multiplier, and discuss the number of cycles and the error achieved for each variant. Extensions of this work to multipliers with other numbers of pipeline stages are given.
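The plain Goldschmidt division iteration (without the paper's correcting-term optimization) can be sketched as follows; the divisor is assumed pre-scaled into [0.5, 1):

```python
def goldschmidt_div(n, d, iters=6):
    """Goldschmidt division n/d for d in [0.5, 1): repeatedly multiply both
    operands by f = 2 - d, driving d quadratically toward 1 (the error 1 - d
    squares at each step). The two multiplications are independent, which is
    why the method maps well onto a pipelined multiplier."""
    assert 0.5 <= d < 1.0
    for _ in range(iters):
        f = 2.0 - d
        n *= f
        d *= f
    return n
```

The paper's contribution is to replace the last such iteration by adding a correcting term obtained from a table lookup performed during the early iterations.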
Meta-analyses of FibroTest diagnostic value in chronic liver disease
Background: FibroTest (FT) is a biomarker of liver fibrosis initially validated in patients with chronic hepatitis C (CHC). The aim was to test two hypotheses: one, that the FT diagnostic value was similar in the three other frequent fibrotic diseases, chronic hepatitis B (CHB), alcoholic liver disease (ALD) and non-alcoholic fatty liver disease (NAFLD); and the other, that the FT diagnostic value was similar for intermediate and extreme fibrosis stages.
Methods: The main end points were the FT areas under the ROC curves (AUROCs) for the diagnosis of bridging fibrosis (F2F3F4 vs. F0F1), standardized for the spectrum of fibrosis stages, and the comparison of FT AUROCs between adjacent stages. Two meta-analyses were performed: one combining all the published studies (random model), and one of an integrated database combining individual data. Sensitivity analysis integrated the independence of authors, length of biopsy, prospective design, respect of procedures, comorbidities, and duration between biopsy and serum sampling.
Results: A total of 30 studies were included, pooling 6,378 subjects with both FT and biopsy (3,501 HCV, 1,457 HBV, 267 NAFLD, 429 ALD, and 724 mixed). Individual data were analyzed in 3,282 patients. The mean standardized AUROC was 0.84 (95% CI, 0.83–0.86), without differences between causes of liver disease: HCV 0.85 (0.82–0.87), HBV 0.80 (0.77–0.84), NAFLD 0.84 (0.76–0.92), ALD 0.86 (0.80–0.92), mixed 0.85 (0.80–0.93). The AUROC for the diagnosis of the intermediate adjacent stages F2 vs. F1 (0.66; 0.63–0.68, n = 2,055) did not differ from that of the extreme stages F3 vs. F4 (0.69; 0.65–0.72, n = 817) or F1 vs. F0 (0.62; 0.59–0.65, n = 1,788).
Conclusion: FibroTest is an effective alternative to biopsy in patients with chronic hepatitis C and B, ALD and NAFLD. The FT diagnostic value is similar for the diagnosis of intermediate and extreme fibrosis stages.