183 research outputs found
Quality of Information in Mobile Crowdsensing: Survey and Research Challenges
Smartphones have become the most pervasive devices in people's lives, and are
clearly transforming the way we live and perceive technology. Today's
smartphones benefit from almost ubiquitous Internet connectivity and come
equipped with a plethora of inexpensive yet powerful embedded sensors, such as
accelerometer, gyroscope, microphone, and camera. This unique combination has
enabled revolutionary applications based on the mobile crowdsensing paradigm,
such as real-time road traffic monitoring, air and noise pollution sensing,
crime control, and wildlife monitoring, to name a few. Unlike prior sensing
paradigms, humans are now the primary actors in the sensing process, as they
are essential to retrieving reliable and up-to-date information about the
monitored event. As humans may behave unreliably or
maliciously, assessing and guaranteeing Quality of Information (QoI) becomes
more important than ever. In this paper, we provide a new framework for
defining and enforcing the QoI in mobile crowdsensing, and analyze in depth the
current state-of-the-art on the topic. We also outline novel research
challenges, along with possible directions for future work. Comment: To appear in ACM Transactions on Sensor Networks (TOSN).
Lead telluride bonding and segmentation study Semiannual phase report, 1 Feb. - 31 Jul. 1969
Thermoelectric system of Cd-Si-Ge, and tungsten diffusion-bonded lead telluride.
A Generative Framework for Low-Cost Result Validation of Outsourced Machine Learning Tasks
The growing popularity of Machine Learning (ML) has led to its deployment in
various sensitive domains, which has resulted in significant research focused
on ML security and privacy. However, in some applications, such as autonomous
driving, integrity verification of the outsourced ML workload is more
critical--a facet that has not received much attention. Existing solutions,
such as multi-party computation and proof-based systems, impose significant
computation overhead, which makes them unfit for real-time applications. We
propose Fides, a novel framework for real-time validation of outsourced ML
workloads. Fides features a novel and efficient distillation technique--Greedy
Distillation Transfer Learning--that dynamically distills and fine-tunes a
space and compute-efficient verification model for verifying the corresponding
service model while running inside a trusted execution environment. Fides
features a client-side attack detection model that uses statistical analysis
and divergence measurements to identify, with a high likelihood, if the service
model is under attack. Fides also offers a re-classification functionality that
predicts the original class whenever an attack is identified. We devised a
generative adversarial network framework for training the attack detection and
re-classification models. The evaluation shows that Fides achieves an accuracy
of up to 98% for attack detection and 94% for re-classification. Comment: 16 pages, 11 figures.
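The divergence-based comparison described above can be sketched in a few lines. This is an illustrative toy only: the softmax/KL-divergence comparison and the `threshold` value below are assumptions for exposition, not the actual statistical analysis or models used by Fides.

```python
import math

def softmax(logits):
    # Convert raw scores into a probability distribution.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    s = sum(exps)
    return [e / s for e in exps]

def kl_divergence(p, q, eps=1e-12):
    # KL(p || q): how far the service model's output distribution p
    # diverges from the verification model's output distribution q.
    return sum(pi * math.log((pi + eps) / (qi + eps)) for pi, qi in zip(p, q))

def flag_attack(service_logits, verifier_logits, threshold=0.5):
    # Flag the service model as suspect when its output diverges
    # too far from the trusted verification model's output.
    # The threshold is a made-up illustrative value.
    p = softmax(service_logits)
    q = softmax(verifier_logits)
    return kl_divergence(p, q) > threshold

# Agreeing outputs: low divergence, no flag raised.
print(flag_attack([5.0, 1.0, 0.2], [4.5, 1.2, 0.1]))   # False
# Contradictory outputs: high divergence, flagged as a possible attack.
print(flag_attack([5.0, 1.0, 0.2], [0.1, 6.0, 0.3]))   # True
```

In practice such a check is cheap enough to run per inference, which is the property that makes real-time validation plausible at all.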
Realization of Multi-Valued Logic Using Optical Quantum Computing
Quantum computing is a paradigm of computing using physical systems that operate according to quantum mechanical principles. Since 2017, functioning quantum processing units with limited capabilities have been available on the cloud. There are two models of quantum computing in the literature: the discrete variable model and the continuous variable model. The discrete variable model is an extension of the binary logic of digital computing, with quantum bits |0⟩ and |1⟩. In the continuous variable model, the quantum state space is infinite-dimensional and the quantum state is expressed with an infinite number of basis elements.
In the physical implementation of quantum computing, however, the quantized energy levels of the electromagnetic field come in multiple values, naturally realizing a multi-valued logic of computing. Hence, to implement the discrete variable model (binary logic) of quantum computing, temperature control is needed to restrict the energy levels to the lowest two, which express the binary quantum states |0⟩ and |1⟩. The physical realization of the continuous variable model naturally implements multi-valued logic, because any physical system has a highest observed quantized energy level, i.e., the quantum state space is always finite-dimensional.
In 2001, Knill, Laflamme, and Milburn proved that linear optics realizes universal quantum computing in the qubit-based model. Optical quantum computers by Xanadu, under the phase space representation of quantum optics, naturally realize the multi-valued logic of quantum computing at room temperature. Optical quantum computers use optical signals, which are most compatible with the fiber optics communication network. They are easy to fabricate for mass production, robust to noise, and have low latency.
Optical quantum computing provides flexibility to the users for determining the dimension of the computational space for each instance of computation. Additionally, nonlinear quantum optical effects are incorporated as nonlinear quantum gates. That flexibility of user-defined dimension of the computational space and availability of nonlinear gates lead to a faithful implementation of quantum neural networks in optical quantum computing. This dissertation provides a full description of a multi-class data quantum classifier on ten classes of the MNIST dataset.
In this dissertation, I provide background information on optical quantum computing as an ideal candidate for building the future classical-quantum hybrid internet, owing to its numerous benefits, chief among them compatibility with the existing communications and computing infrastructure. I also show that optical quantum computing can serve as a hardware platform for realizing the multi-valued logic of computing without the need to encode and decode computational problems in binary logic. I also derive explicit matrix representations of optical quantum gates in the phase space representation. Using the multi-valued logic of optical quantum computing, I introduce the first quantum multi-class data classifier, classifying all ten classes of the MNIST dataset.
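The claim that the state space is always finite-dimensional, with a dimension the user can choose, can be illustrated with a small NumPy sketch of a truncated Fock space. The dimension `d = 4` and the operators below are standard textbook constructions chosen for illustration; this is not Xanadu's software stack or the dissertation's actual formalism.

```python
import numpy as np

def annihilation(d):
    # Truncated annihilation operator on a d-dimensional Fock space:
    # a|n> = sqrt(n)|n-1>, kept only for levels n < d.
    return np.diag(np.sqrt(np.arange(1, d)), k=1)

d = 4                      # user-chosen dimension of the computational space
a = annihilation(d)
n_op = a.conj().T @ a      # number operator a†a

# Its eigenvalues are the allowed logic levels 0, 1, ..., d-1:
# a d-valued logic rather than the binary {|0>, |1>}.
levels = np.sort(np.linalg.eigvalsh(n_op))
print(levels)              # [0. 1. 2. 3.]

# The canonical commutator [a, a†] = I holds only below the cutoff;
# the last diagonal entry (1 - d) is the signature of the finite truncation.
comm = a @ a.conj().T - n_op
print(np.diag(comm))       # [ 1.  1.  1. -3.]
```

Raising `d` enlarges the logic alphabet without changing the construction, which mirrors the flexibility of a user-defined computational dimension described above.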
SHI(EL)DS: A Novel Hardware-based Security Backplane to Enhance Security with Minimal Impact to System Operation
Computer security continues to increase in importance, both in the commercial world and within the Air Force. Dedicated hardware for security purposes provides and enhances a number of security capabilities. Hardware enhances both the security of the security system itself and the quality and trustworthiness of the information gathered by the security monitors. However, hardware reduces avenues of attack and ensures the trustworthiness of information only through proper design and placement. Without careful system design, security hardware leaves itself vulnerable to many of the attacks it is capable of defending against. Our SHI(EL)DS architecture combines these insights into a comprehensive, modular hardware security backplane architecture. This architecture provides many of the capabilities required by the Cybercraft deployment platform. Most importantly, it makes significant progress towards establishing a root of trust for this platform. Progressing the development of the Cybercraft initiative advances the Air Force's ability to operate in and defend cyberspace.
Envisioning the Future of Cyber Security in Post-Quantum Era: A Survey on PQ Standardization, Applications, Challenges and Opportunities
The rise of quantum computers exposes vulnerabilities in current public key
cryptographic protocols, necessitating the development of secure post-quantum
(PQ) schemes. Hence, we conduct a comprehensive study of various PQ approaches,
covering constructional design and structural vulnerabilities, and offering
security assessments and implementation evaluations, with a particular focus on
side-channel attacks. We analyze global standardization processes, evaluate
their metrics in relation to real-world applications, and primarily focus on
standardized PQ schemes, selected additional signature competition candidates,
and PQ-secure cutting-edge schemes beyond standardization. Finally, we present
visions and potential future directions for a seamless transition to the PQ
era.
Finding and Evaluating Parameters for BGV
Fully Homomorphic Encryption (FHE) is a groundbreaking technology that allows arbitrary computations to be performed on encrypted data. State-of-the-art schemes such as Brakerski-Gentry-Vaikuntanathan (BGV) are based on the Learning with Errors over rings (RLWE) assumption, and each ciphertext has an associated error that grows with each homomorphic operation.
For correctness, the error needs to stay below a certain threshold, so the parameters must trade off security against the error margin available for computations.
Choosing the parameters accordingly, for example, the polynomial degree or the ciphertext modulus, is challenging and requires expert knowledge specific to each scheme.
In this work, we improve all steps of the parameter generation process. We provide a comprehensive analysis of BGV in the Double Chinese Remainder Theorem (DCRT) representation, yielding more accurate bounds than previous work on the DCRT, and empirically derive a closed formula linking the security level, the polynomial degree, and the ciphertext modulus.
Additionally, we introduce new circuit models and combine our theoretical work in an easy-to-use parameter generator for researchers and practitioners interested in using BGV for secure computation.
Our formula yields better security estimates than previous closed formulas, while our DCRT analysis reduces prime sizes by up to 42% compared to previous work.
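The correctness constraint behind this parameter trade-off can be illustrated with a toy error-budget count: decryption works while the noise stays below the modulus, so the modulus size caps the number of sequential multiplications. All bit counts and growth rates below are made-up illustrative numbers, not the bounds or formulas derived in this work.

```python
import math

def max_mult_depth(log_q, log_fresh_noise, log_growth_per_mult):
    # Toy error-budget model (NOT the paper's BGV analysis):
    # a ciphertext starts with ~log_fresh_noise bits of noise, each
    # multiplication adds ~log_growth_per_mult bits, and decryption
    # fails once the noise exhausts the log2(q) bits of the modulus.
    budget = log_q - log_fresh_noise
    return max(0, math.floor(budget / log_growth_per_mult))

# Illustrative numbers only: a 438-bit ciphertext modulus, ~20 bits of
# fresh noise, and ~25 bits of noise growth per multiplication.
print(max_mult_depth(438, 20, 25))   # 16
# A modulus too small for even one multiplication:
print(max_mult_depth(30, 20, 25))    # 0
```

The tension in the abstract follows directly: a larger modulus buys more depth but, at a fixed polynomial degree, lowers the security level, which is why generating good parameters requires scheme-specific expertise.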