17 research outputs found
pAElla: Edge-AI based Real-Time Malware Detection in Data Centers
The increasing use of Internet-of-Things (IoT) devices for monitoring a wide
spectrum of applications, together with the "big data" streaming support such
monitoring often requires for data analysis, is drawing growing attention to
the emerging edge computing paradigm. In particular, smart approaches that
manage and analyze data directly on the network edge are increasingly being
investigated, and Artificial Intelligence (AI)-powered edge computing is
envisaged as a promising direction. In this paper, we focus on
Data Centers (DCs) and Supercomputers (SCs), where a new generation of
high-resolution monitoring systems is being deployed, opening new opportunities
for analyses such as anomaly detection and security, but also new challenges
in handling the vast amount of data these systems produce. In detail, we
report on a novel lightweight and scalable approach to increasing the security
of DCs/SCs, which applies AI-powered edge computing to high-resolution power
consumption measurements. The method -- called pAElla -- targets real-time
Malware Detection (MD), runs on an out-of-band IoT-based monitoring system for
DCs/SCs, and combines the Power Spectral Density of power measurements with
AutoEncoders.
Results are promising, with an F1-score close to 1, and a False Alarm and
Malware Miss rate close to 0%. We compare our method with State-of-the-Art (SoA) MD
techniques and show that, in the context of DCs/SCs, pAElla can cover a wider
range of malware, significantly outperforming SoA approaches in terms of
accuracy. Moreover, we propose a methodology for online training suitable for
DCs/SCs in production, and release an open dataset and code.
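The pipeline the abstract describes (spectral features from power traces, scored by a reconstruction model trained on malware-free data) can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: the windowing and function names are hypothetical, and a PCA low-rank reconstruction stands in for the paper's autoencoder.

```python
import numpy as np

def psd_features(signal, n_fft=256):
    """Per-window power spectral density via the periodogram (|FFT|^2 / N)."""
    windows = signal[: len(signal) // n_fft * n_fft].reshape(-1, n_fft)
    spectra = np.abs(np.fft.rfft(windows, axis=1)) ** 2 / n_fft
    return np.log1p(spectra)  # log-compress the dynamic range

def fit_detector(normal_psd, n_components=8):
    """Learn a low-rank reconstruction of 'normal' PSD rows (PCA stands in
    here for the paper's autoencoder) and a 3-sigma error threshold."""
    mean = normal_psd.mean(axis=0)
    centered = normal_psd - mean
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    basis = vt[:n_components]
    errors = np.linalg.norm(centered - centered @ basis.T @ basis, axis=1)
    return mean, basis, errors.mean() + 3 * errors.std()

def is_anomalous(psd_row, mean, basis, threshold):
    """Flag a window whose reconstruction error exceeds the threshold."""
    centered = psd_row - mean
    error = np.linalg.norm(centered - centered @ basis.T @ basis)
    return error > threshold
```

In the paper's setting the rows would come from the out-of-band power telemetry; training only on malware-free windows and flagging high reconstruction error is the generic autoencoder-for-anomaly-detection pattern the method builds on.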
On the cost of computing isogenies between supersingular elliptic curves
The security of the Jao-De Feo Supersingular Isogeny Diffie-Hellman
(SIDH) key agreement scheme is based on the intractability of the
Computational Supersingular Isogeny (CSSI) problem --- computing
F_{p^2}-rational isogenies of degrees 2^e and 3^e between certain
supersingular elliptic curves defined over F_{p^2}. The classical
meet-in-the-middle attack on CSSI has an expected running time of O(p^{1/4}),
but also has O(p^{1/4}) storage requirements. In this paper, we demonstrate that the van
Oorschot-Wiener collision finding algorithm has a lower cost (but
higher running time) for solving CSSI, and thus should be used instead
of the meet-in-the-middle attack to assess the security of SIDH against
classical attacks. The smaller parameter p brings significantly improved
performance for SIDH.
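The memory/time trade-off the abstract refers to can be illustrated on a toy instance. The sketch below runs a van Oorschot-Wiener style distinguished-point collision search against a small stand-in "random" function (SHA-256 truncated to 20 bits); it is not the CSSI instantiation, and the domain size, distinguishing rule, and names are all illustrative.

```python
import hashlib
import random

N_BITS = 20   # toy domain of size 2^20; real CSSI instances are far larger
DP_BITS = 6   # a point is 'distinguished' if its low 6 bits are zero

def f(x):
    """Toy random-looking function standing in for the CSSI step function."""
    digest = hashlib.sha256(x.to_bytes(4, "little")).digest()
    return int.from_bytes(digest[:4], "little") & ((1 << N_BITS) - 1)

def walk(start, max_len):
    """Iterate f until a distinguished point; return (point, trail length)."""
    x, n = start, 0
    while n < max_len:
        x = f(x)
        n += 1
        if x & ((1 << DP_BITS) - 1) == 0:
            return x, n
    return None, None  # abandon rare over-long trails (possible cycles)

def locate(s1, l1, s2, l2):
    """Given two trails ending at the same distinguished point, align them
    and step in lockstep to recover the colliding pair."""
    if l1 < l2:
        s1, l1, s2, l2 = s2, l2, s1, l1
    for _ in range(l1 - l2):
        s1 = f(s1)
    if s1 == s2:
        return None  # same trail, no genuine collision
    while True:
        n1, n2 = f(s1), f(s2)
        if n1 == n2:
            return s1, s2
        s1, s2 = n1, n2

def vow_collision(seed=1):
    rng = random.Random(seed)
    trails = {}  # distinguished point -> (start, length): the only storage
    while True:
        start = rng.randrange(1 << N_BITS)
        dp, length = walk(start, 20 << DP_BITS)
        if dp is None:
            continue
        if dp in trails and trails[dp][0] != start:
            pair = locate(*trails[dp], start, length)
            if pair:
                return pair
        trails[dp] = (start, length)
```

Only one (start, length) triple is stored per trail, which is the source of the memory savings over the meet-in-the-middle attack's full lookup table, at the cost of the extra walking time the abstract mentions.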
Investigation of Parallel Data Processing Using Hybrid High Performance CPU + GPU Systems and CUDA Streams
The paper investigates parallel data processing in a hybrid CPU+GPU(s) system using multiple CUDA streams for overlapping communication and computations. This is crucial for efficient processing of data, in particular for incoming data streams that would naturally be forwarded to GPUs over multiple CUDA streams. Performance is evaluated for various compute-time to host-device communication-time ratios, numbers of CUDA streams, and numbers of threads managing computations on GPUs. Tests also reveal the benefits of using CUDA MPS for overlapping communication and computations when using multiple processes. Furthermore, versions using standard GPU memory allocation and Unified Memory are compared, the latter including programmer-added prefetching. The performance of a hybrid CPU+GPU version, as well as scaling across multiple GPUs, is demonstrated, showing good speed-ups for the approach. Finally, the performance per unit of power consumption of selected configurations is presented for various numbers of streams and various relative performances of GPUs and CPUs.
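The effect of the compute-to-communication ratio the paper studies can be reasoned about with a simple two-stage pipeline model (copy engine plus compute engine). The sketch below is an analytic back-of-the-envelope model, not CUDA code, and the chunk times are hypothetical parameters.

```python
def serial_time(n_chunks, t_copy, t_compute):
    """One stream: each chunk's host-to-device copy and kernel run back to back."""
    return n_chunks * (t_copy + t_compute)

def pipelined_time(n_chunks, t_copy, t_compute):
    """Multiple streams: after the first chunk's copy, the copy engine and
    the compute engine work concurrently, so the slower stage dominates."""
    return t_copy + t_compute + (n_chunks - 1) * max(t_copy, t_compute)

def overlap_speedup(n_chunks, t_copy, t_compute):
    """Speedup of overlapped streams over a single fully serialized stream."""
    return serial_time(n_chunks, t_copy, t_compute) / pipelined_time(n_chunks, t_copy, t_compute)
```

For many chunks the speedup tends to (t_copy + t_compute) / max(t_copy, t_compute), at most 2x in this single-copy-engine model; it is largest when compute and communication times are balanced, which is consistent with the paper's focus on varying their ratio.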
Cryptography: Against AI and QAI Odds
Artificial Intelligence (AI) presents prodigious technological prospects for
development, however, all that glitters is not gold! The cyber-world faces the
worst nightmare with the advent of AI and quantum computers. Together with
Quantum Artificial Intelligence (QAI), they pose a catastrophic threat to
modern cryptography. It would also increase the capability of cryptanalysts
manifold, with its built-in persistent and extensive predictive intelligence.
This prediction ability incapacitates the constrained message space in device
cryptography. By comparing these predictions against the intercepted
ciphertext, the code-cracking process will accelerate considerably. Before the
vigorous and robust developments in AI, we had never faced, and never had to
prepare for, such a plaintext-originating attack. The supremacy of AI can be
challenged by creating ciphertexts that give the AI attacker erroneous,
randomness-stymied responses and misdirect it. The AI threat is deterred by
deviating from the conventional use of small, known-size keys and
pattern-loaded ciphers. The strategy is vested in implementing larger secret
size keys, supplemented by ad-hoc unilateral randomness of unbound limitations
and a pattern-devoid technique. The very large key size can be handled with low
processing and computational burden to achieve desired unicity distances. The
strategy against AI odds is feasible by implementing non-algorithmic
randomness, large and inexpensive memory chips, and wide-area communication
networks. The strength of AI, i.e., randomness and pattern detection can be
used to generate highly optimized ciphers and algorithms. These pattern-devoid,
randomness-rich ciphers also provide a timely and plausible solution for NIST's
proactive approach toward the quantum challenge.
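The abstract's point about large keys and unicity distance can be made concrete with Shannon's classical formula U = H(K) / D, where H(K) is the key entropy in bits and D the per-character redundancy of the plaintext language. The 3.2 bits/char figure for English below is a commonly quoted estimate, assumed here purely for illustration.

```python
def unicity_distance(key_bits, redundancy_bits_per_char=3.2):
    """Shannon's unicity distance: expected ciphertext length (in characters)
    beyond which a ciphertext-only attacker can single out the true key."""
    return key_bits / redundancy_bits_per_char

# A 128-bit key yields a unicity distance of ~40 characters, while a
# 1-megabit key pushes it past 300,000 characters of ciphertext.
```

This matches the abstract's two levers: enlarging the key raises H(K), and pattern-devoid ciphers lower the effective redundancy D, both of which push the unicity distance higher.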
Near-Term Quantum Computing Techniques: Variational Quantum Algorithms, Error Mitigation, Circuit Compilation, Benchmarking and Classical Simulation
Quantum computing is a game-changing technology for global academia, research
centers and industries including computational science, mathematics, finance,
pharmaceutical, materials science, chemistry and cryptography. Although it has
seen a major boost in the last decade, we are still a long way from reaching
the maturity of a full-fledged quantum computer. That said, we will be in the
Noisy Intermediate-Scale Quantum (NISQ) era for a long time, working with
quantum computing systems of dozens or even thousands of qubits. An outstanding
challenge, then, is to come up with an application that can reliably carry out
a nontrivial task of interest on the near-term quantum devices with
non-negligible quantum noise. To address this challenge, several near-term
quantum computing techniques, including variational quantum algorithms, error
mitigation, quantum circuit compilation and benchmarking protocols, have been
proposed to characterize and mitigate errors, and to implement algorithms with
a certain resistance to noise, so as to enhance the capabilities of near-term
quantum devices and explore the boundaries of their ability to realize useful
applications. Moreover, the development of near-term quantum devices is
inseparable from efficient classical simulation, which plays a vital role
in quantum algorithm design and verification, error-tolerant verification, and
other applications. This review will provide a thorough introduction of these
near-term quantum computing techniques, report on their progress, and finally
discuss the future prospect of these techniques, which we hope will motivate
researchers to undertake additional studies in this field.
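As a flavor of the variational approach, and of the classical simulation used to design and verify such algorithms, the sketch below minimizes the energy of a toy one-qubit Hamiltonian H = X + Z with a one-parameter Ry ansatz, evaluated entirely with a classical statevector. The Hamiltonian, ansatz, and grid-scan "optimizer" are illustrative choices, not taken from the review.

```python
import numpy as np

# Pauli matrices and a toy one-qubit Hamiltonian H = X + Z
X = np.array([[0.0, 1.0], [1.0, 0.0]])
Z = np.array([[1.0, 0.0], [0.0, -1.0]])
H = X + Z  # exact ground-state energy: -sqrt(2)

def ansatz(theta):
    """Variational state Ry(theta)|0> as a classical statevector."""
    return np.array([np.cos(theta / 2), np.sin(theta / 2)])

def energy(theta):
    """Expectation <psi(theta)|H|psi(theta)>, evaluated by classical simulation."""
    psi = ansatz(theta)
    return psi @ H @ psi

# Coarse grid scan standing in for the variational optimizer loop
thetas = np.linspace(0.0, 2.0 * np.pi, 2001)
best_theta = min(thetas, key=energy)  # energy(best_theta) is close to -sqrt(2)
```

On real NISQ hardware the energy would instead be estimated from noisy shot counts, which is exactly where the error-mitigation and benchmarking techniques surveyed in the review enter the picture.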