Quantum Key Distribution (QKD) and Commodity Security Protocols: Introduction and Integration
We present an overview of quantum key distribution (QKD), a secure key
exchange method based on the quantum laws of physics rather than computational
complexity. We also provide an overview of the two most widely used commodity
security protocols, IPsec and TLS. Pursuing a key exchange model, we propose
how QKD could be integrated into these security applications. For such a QKD
integration we propose a support layer that provides a set of common QKD
services between the QKD protocol and the security applications.
Comment: 12 pages
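The proposed support layer sits between the QKD protocol and the security application and delivers key material through a common service interface. A minimal sketch of how such a layer might hand QKD-derived keys to a TLS or IPsec endpoint is shown below; all class and function names are hypothetical, the "quantum channel" is simulated with `os.urandom`, and the key derivation uses HKDF (RFC 5869) implemented with the standard library:

```python
import hashlib
import hmac
import os

def hkdf_sha256(ikm: bytes, salt: bytes, info: bytes, length: int = 32) -> bytes:
    """HKDF (RFC 5869) with SHA-256: extract-then-expand key derivation."""
    prk = hmac.new(salt, ikm, hashlib.sha256).digest()  # extract
    okm, block, counter = b"", b"", 1
    while len(okm) < length:                            # expand
        block = hmac.new(prk, block + info + bytes([counter]),
                         hashlib.sha256).digest()
        okm += block
        counter += 1
    return okm[:length]

class QKDSupportLayer:
    """Hypothetical support-layer service between the QKD protocol and a
    security application (IPsec or TLS). The QKD-generated bits are
    simulated here with os.urandom."""

    def __init__(self) -> None:
        self._keys: dict[str, bytes] = {}

    def get_key(self, length: int = 32) -> tuple[str, bytes]:
        """Return a (key_id, key_material) pair to the requesting application."""
        key_id = os.urandom(16).hex()
        self._keys[key_id] = os.urandom(length)  # stand-in for QKD key bits
        return key_id, self._keys[key_id]

# A security application requests QKD key material and derives a
# context-bound session key (e.g. usable as a TLS pre-shared key).
layer = QKDSupportLayer()
key_id, ikm = layer.get_key()
session_key = hkdf_sha256(ikm, salt=b"\x00" * 32,
                          info=b"tls-psk:" + key_id.encode())
```

Binding the derived key to a context label (`info`) is a common way to keep keys handed out for TLS separate from those handed out for IPsec, even when both draw from the same QKD pool.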
60 GHz MAC Standardization: Progress and Way Forward
Communication at mmWave frequencies has been a major research focus in recent years.
In this paper, we discuss standardization efforts in 60 GHz short range
communication and the progress therein. We compare the available standards in
terms of network architecture, medium access control mechanisms, physical layer
techniques, and several other features. Comparative analysis indicates that IEEE
802.11ad is likely to lead short-range indoor communication at 60 GHz. We
bring to the fore resolved and unresolved issues pertaining to robust WLAN
connectivity at 60 GHz. Further, we discuss the role of mmWave bands in 5G
communication scenarios and highlight the further efforts required in terms of
research and standardization.
Demystifying Big Data Adoption: Beyond IT Fashion and Relative Advantage
There is a paradox in big data adoption: a peak of hype and simultaneously an unexpectedly low deployment rate. The present multiple case study research develops a Big Data Adoption (Big2) model that helps to explain this paradox and sheds light on the “whether”, “why”, and “how” questions regarding big data adoption. The Big2 model extends beyond the existing Relative Advantage and IT Fashion theories to include organizational, environmental, and social variables, as well as new psychological factors that are unique to big data adoption. Our analysis reveals that the outcome of big data adoption is indeterministic, which defies the implicit assumption of most simplistic “rational-calculus” models of innovation adoption: Relative Advantage is a necessary but not sufficient condition for big data adoption. Most importantly, our study uncovered a “Deployment Gap” and a “Limbo Stage” in which companies experiment for a long time and do not proceed to deployment despite the intent to adopt big data. As a result, there are four big data adoption categories: Not Adopting, Experimented but Not Adopting, Not Yet Deployed, and Deployed. Our Big2 model contributes a Paradigm Shift and Complexity Tolerance perspective to understanding the “why” in each of the four adoption categories. This study further identifies nine complexity tolerance strategies to help narrow the Deployment Gap, but also shows that big data is not for everyone.