Quantitative Analysis for Authentication of Low-cost RFID Tags
Formal analysis techniques are widely used today to verify and analyze communication protocols. In this work, we conduct a quantitative verification analysis of the low-cost Radio Frequency Identification (RFID) protocol proposed by Song and Mitchell. The analysis employs a Discrete-Time Markov Chain (DTMC) built with the well-known PRISM model checker. We have managed to represent up to 100 RFID tags communicating with a reader and to quantify each RFID session according to the protocol's computation and transmission cost requirements. Consequently, the proposed analysis not only provides quantitative verification results but also constitutes a methodology for RFID designers who want to validate their products under specific cost requirements.
Comment: To appear in the 36th IEEE Conference on Local Computer Networks (LCN 2011)
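The kind of per-session cost quantification described above can be illustrated with a toy computation. The sketch below finds the expected total cost of a retry-until-success tag session, a minimal stand-in for a PRISM reward query over a DTMC; the success probability and per-round cost are illustrative assumptions, not figures from the paper.

```python
def expected_cost(p_success: float = 0.9, c_round: float = 5.0,
                  tol: float = 1e-12) -> float:
    """Expected total cost of a tag session modelled as a tiny DTMC:
    each round costs c_round; with probability p_success the session
    completes, otherwise it loops back and retries.
    Closed form for this two-state chain: c_round / p_success."""
    e, prev = 0.0, -1.0
    while abs(e - prev) > tol:
        prev = e
        e = c_round + (1 - p_success) * e  # value-iteration fixpoint
    return e

print(expected_cost())  # about 5.556
```

A real PRISM model would attach such costs as reward structures and let the checker compute the expectation symbolically; the fixpoint iteration above is just the numeric analogue.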
Cryptanalysis of two mutual authentication protocols for low-cost RFID
Radio Frequency Identification (RFID) is emerging as a favored technology for automated identification, with wide applicability to areas such as e-passports, supply chain management, and ticketing. However, researchers have found many security and privacy problems with RFID technology, and in recent years much attention has focused on RFID authentication protocols and their security flaws. In this paper, we analyze two of the newest RFID authentication protocols, proposed by Fu et al. and Li et al., from several security viewpoints. We present different attacks on these protocols, including a desynchronization attack and a privacy analysis.
Comment: 17 pages, 2 figures, 1 table, International Journal of Distributed and Parallel Systems
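Desynchronization attacks of the kind mentioned above typically target the key-update step of stateful protocols. The toy sketch below is a generic illustration (not the actual Fu et al. or Li et al. schemes, which the abstract does not specify): blocking a single final message leaves the tag and the reader holding different keys.

```python
import hashlib

class Party:
    """Toy key-updating party; the SHA-256 update is a stand-in
    one-way key-update function, not any specific protocol's."""
    def __init__(self, key: bytes):
        self.key = key

    def update(self):
        self.key = hashlib.sha256(self.key).digest()

tag, reader = Party(b"shared"), Party(b"shared")

# A complete session: both sides update and remain synchronized.
tag.update(); reader.update()
assert tag.key == reader.key

# Attack: the adversary blocks the protocol's final message, so only
# the reader performs its key update; the stored keys diverge and the
# next authentication attempt will fail.
reader.update()
assert tag.key != reader.key
```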
Optimal security limits of RFID distance bounding protocols
In this paper, we classify RFID distance bounding protocols that have bitwise fast phases and no final signature. We also give theoretical security bounds for two specific classes, leaving the security bounds for the general case as an open problem. For the classification, we introduce the notion of k-previous challenge dependent (k-PCD) protocols, in which each response bit depends on the current and the k previous challenges and there is no final signature. We treat the case k = 0, where each response bit depends only on the current challenge, as a special case and call such protocols current challenge dependent (CCD) protocols. In general, we construct a trade-off curve between the security levels against mafia and distance frauds by introducing two generic attack algorithms. This leads to the conclusion that CCD protocols cannot attain the ideal security against distance fraud, i.e., 1/2 per challenge-response bit, without totally losing security against mafia fraud. We extend the generic attacks to 1-PCD protocols and obtain a trade-off curve showing that 1-PCD protocols can provide better security than CCD protocols. Accordingly, we propose a natural extension of a CCD protocol to a 1-PCD protocol in order to improve its security. As a case study, we give two natural extensions of the Hancke and Kuhn protocol to show how to enhance the security against either mafia fraud or distance fraud without extra cost.
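The per-round distance-fraud limit discussed above can be checked empirically for the Hancke-Kuhn protocol, the canonical CCD example: a far-away prover that must commit to each response before the challenge bit arrives succeeds with probability 3/4 per round. The simulation below is a sketch; the nonce handling and register derivation are simplified assumptions.

```python
import hashlib
import hmac
import random
import secrets

def fraud_round(key: bytes) -> bool:
    """One fast-phase round of a Hancke-Kuhn style CCD protocol, played
    by a distance-fraud adversary who must commit to a response before
    the challenge bit arrives."""
    # Honest setup: both response registers R0, R1 are derived from the
    # shared key and fresh session nonces via a PRF (here, HMAC-SHA256).
    nonces = secrets.token_bytes(16)
    digest = hmac.new(key, nonces, hashlib.sha256).digest()
    r0, r1 = digest[0] & 1, (digest[0] >> 1) & 1

    # The distant prover knows R0 and R1 but not the upcoming challenge:
    # it answers the common bit when R0 == R1, otherwise it must guess.
    answer = r0 if r0 == r1 else random.randrange(2)
    challenge = secrets.randbelow(2)
    return answer == (r1 if challenge else r0)

trials = 20000
rate = sum(fraud_round(b"shared-key") for _ in range(trials)) / trials
print(f"distance-fraud success per round ≈ {rate:.3f}")  # about 0.75
```

The 3/4 figure falls out of the strategy: half the time the registers agree (certain win), and otherwise a coin flip wins half the time, giving 1/2 + 1/2 · 1/2 = 3/4.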
A Fault Analytic Method against HB+
The search for lightweight authentication protocols suitable for low-cost RFID tags constitutes an active and challenging research area. In this context, a family of protocols based on the LPN problem has been proposed: the so-called HB family. Despite the rich literature on the cryptanalysis of these protocols, there are no published results about the impact of fault analysis on them. The purpose of this paper is to fill this gap by presenting a fault analytic method against a prominent member of the HB family: the HB+ protocol. We demonstrate that the fault analysis model can lead to a flexible and effective attack against HB-like protocols, posing a serious threat to them.
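For context, a single round of HB+ (as introduced by Juels and Weis) can be sketched as follows; the parameters (k = 32, noise rate η = 0.125) are illustrative choices, and the code simulates honest executions only, not the fault-injection attack of the paper.

```python
import random

def dot(u, v):
    """Inner product over GF(2)."""
    return sum(a & b for a, b in zip(u, v)) & 1

def hb_plus_round(x, y, eta):
    """One honest round of HB+: the tag holds secrets x, y and answers
    the reader's challenge with a noisy LPN sample. Returns True when
    the reader's check passes (i.e. the noise bit was 0)."""
    k = len(x)
    b = [random.randrange(2) for _ in range(k)]  # tag's blinding vector
    a = [random.randrange(2) for _ in range(k)]  # reader's challenge
    noise = random.random() < eta                # Bernoulli(eta) noise bit
    z = dot(a, x) ^ dot(b, y) ^ noise
    # The response is wrong exactly when the noise bit was set.
    return z == dot(a, x) ^ dot(b, y)

random.seed(1)
x = [random.randrange(2) for _ in range(32)]
y = [random.randrange(2) for _ in range(32)]
errors = sum(not hb_plus_round(x, y, 0.125) for _ in range(1000))
# With eta = 0.125 roughly 125 of 1000 rounds fail, so the verifier
# must accept a tag whose error count stays under some threshold.
```

The deliberate noise is what ties security to the LPN problem, and it is also why a verifier needs an error threshold rather than exact matching, which connects to the thresholded-authentication analysis below.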
Expected loss analysis of thresholded authentication protocols in noisy conditions
A number of authentication protocols have been proposed recently in which at least some part of the authentication is performed during a phase, lasting a number of rounds, with no error correction. This requires assigning an acceptable threshold for the number of detected errors. This paper describes a framework enabling an expected loss analysis for all the protocols in this family. Furthermore, computationally simple methods are suggested for obtaining nearly optimal values of both the threshold and the number of rounds. Finally, a method to adaptively select both the number of rounds and the threshold is proposed.
Comment: 17 pages, 2 figures; draft
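The threshold selection problem described above can be sketched with binomial tails: an honest prover errs at the channel noise rate, a guessing adversary at rate 1/2, and the threshold trades false rejects against false accepts. The cost weights and rates below are illustrative assumptions, not the paper's actual loss framework.

```python
from math import comb

def binom_tail(n: int, p: float, tau: int) -> float:
    """P[Bin(n, p) <= tau]."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(tau + 1))

def best_threshold(n: int, noise: float,
                   c_fr: float = 1.0, c_fa: float = 1.0) -> int:
    """Pick the error threshold tau minimizing an expected loss that
    trades a false reject (an honest prover, error rate = noise,
    exceeds tau) against a false accept (a guessing adversary, error
    rate 1/2, stays within tau). The cost weights are illustrative."""
    return min(
        range(n + 1),
        key=lambda tau: c_fr * (1 - binom_tail(n, noise, tau))
                        + c_fa * binom_tail(n, 0.5, tau),
    )

tau = best_threshold(n=64, noise=0.1)
# The optimum lies between the two means (6.4 and 32), roughly where
# the honest and adversarial binomial densities cross.
print(tau)
```

Sweeping this loss over both n and tau is the brute-force analogue of the nearly optimal selection methods the abstract mentions.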