2,295 research outputs found

    LEDAkem: a post-quantum key encapsulation mechanism based on QC-LDPC codes

    This work presents a new code-based key encapsulation mechanism (KEM) called LEDAkem. It is built on the Niederreiter cryptosystem and relies on quasi-cyclic low-density parity-check (QC-LDPC) codes as secret codes, providing high decoding speed and compact keypairs. LEDAkem uses ephemeral keys to foil known statistical attacks, and takes advantage of a new decoding algorithm that is faster than the classical bit-flipping decoder commonly adopted in this kind of system. The main attacks against LEDAkem are investigated, taking quantum speedups into account. Instances of LEDAkem are designed to achieve different security levels against classical and quantum computers, and performance figures obtained through an efficient C99 implementation are provided. (Comment: 21 pages, 3 tables)
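The classical bit-flipping decoder that LEDAkem improves upon can be sketched as follows. This is a generic textbook hard-decision bit-flipping decoder for a binary parity-check matrix, not LEDAkem's improved algorithm, and the toy Hamming-code matrix is purely illustrative:

```python
import numpy as np

def bit_flip_decode(H, y, max_iter=50):
    """Generic hard-decision bit-flipping decoding (illustrative sketch).

    H: binary parity-check matrix (m x n); y: received binary word.
    On each iteration, flip the bits involved in the largest number of
    unsatisfied parity checks, until the syndrome is zero.
    """
    y = y.copy()
    for _ in range(max_iter):
        syndrome = H @ y % 2                  # unsatisfied parity checks
        if not syndrome.any():
            return y                          # valid codeword reached
        # count, per bit, how many failing checks it participates in
        upc = H.T @ syndrome
        y[upc == upc.max()] ^= 1              # flip the worst bits
    return None                               # decoding failure

# Toy example: parity-check matrix of the (7,4) Hamming code
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])
codeword = np.zeros(7, dtype=int)
received = codeword.copy()
received[2] ^= 1                              # inject a single bit error
print(bit_flip_decode(H, received))           # recovers the all-zero codeword
```

QC-LDPC codes make H sparse and quasi-cyclic, which is what gives the scheme its compact keys and fast decoding.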

    Wolf Search Algorithm for Solving Optimal Reactive Power Dispatch Problem

    This paper presents a new bio-inspired heuristic optimization algorithm, the Wolf Search Algorithm (WSA), for solving the multi-objective reactive power dispatch problem. WSA is based on wolf preying behaviour: the way wolves search for food and survive by avoiding their enemies is imitated to formulate the algorithm. The speciality of the wolf is that it possesses both individual local searching ability and autonomous flocking movement, and this property is exploited in the search algorithm. The proposed WSA has been tested on the standard IEEE 30-bus test system, and the simulation results clearly show the good performance of the proposed algorithm.
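A minimal sketch of the wolf-preying idea (individual local search, flocking toward a better peer, and escape jumps) is shown below on a toy objective; the parameters and the sphere function are illustrative stand-ins, not the paper's IEEE 30-bus reactive power dispatch formulation:

```python
import random

def wolf_search(f, dim, n_wolves=10, visual=0.5, escape_p=0.2,
                bounds=(-5.0, 5.0), iters=200, seed=1):
    """Minimal Wolf Search Algorithm sketch (parameters are illustrative).

    Each wolf probes locally within its visual range (individual local
    search), moves toward the best-positioned peer (autonomous flocking),
    and occasionally jumps away as if escaping an enemy, which preserves
    diversity in the pack.
    """
    rng = random.Random(seed)
    lo, hi = bounds
    clamp = lambda x: min(hi, max(lo, x))
    wolves = [[rng.uniform(lo, hi) for _ in range(dim)]
              for _ in range(n_wolves)]
    best = min(wolves, key=f)
    for _ in range(iters):
        for i in range(n_wolves):
            # 1) local prey-seeking step, accepted only on improvement
            cand = [clamp(x + rng.uniform(-visual, visual))
                    for x in wolves[i]]
            if f(cand) < f(wolves[i]):
                wolves[i] = cand
            # 2) flock toward the best peer currently visible
            peer = min(wolves, key=f)
            if f(peer) < f(wolves[i]):
                wolves[i] = [clamp(x + rng.random() * (p - x))
                             for x, p in zip(wolves[i], peer)]
            if f(wolves[i]) < f(best):
                best = list(wolves[i])
            # 3) random escape jump (enemy avoidance)
            if rng.random() < escape_p:
                wolves[i] = [clamp(x + rng.uniform(-1.0, 1.0))
                             for x in wolves[i]]
    return best

sphere = lambda v: sum(x * x for x in v)
best = wolf_search(sphere, dim=2)
print(sphere(best) < 1.0)
```

In the dispatch setting, `f` would be the reactive power objective evaluated by a load-flow solver, and the bounds would be the control variable limits.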

    Stigmergic hyperlinks' contributions to web search

    Stigmergic hyperlinks are hyperlinks with a "heart beat": if used, they stay healthy and online; if neglected, they fade and are eventually replaced. Their life attribute is a relative usage measure that regular hyperlinks do not provide; hence PageRank-like measures have historically been well informed about the structure of webs of documents, but unaware of what users actually do with the links. This paper elaborates on how to input the users' perspective into Google's original, structure-centric PageRank metric. The discussion then bridges to the Deep Web, some search challenges, and how stigmergic hyperlinks could help decentralize the search experience, facilitating user-generated search solutions and supporting new related business models.
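One way to feed the users' perspective into a structure-centric PageRank, as the abstract proposes, is to distribute each page's rank over its outlinks in proportion to recorded link usage; the weighting and smoothing below are illustrative assumptions, not the paper's exact formulation:

```python
def usage_weighted_pagerank(links, usage, d=0.85, iters=50):
    """PageRank variant where a page's rank is distributed over its
    outlinks in proportion to observed link usage (clicks) instead of
    uniformly, as in the structure-only original. Illustrative sketch.

    links: {page: [outlinked pages]}; usage: {(src, dst): click count}.
    """
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iters):
        new = {p: (1 - d) / n for p in pages}
        for src, outs in links.items():
            # weight each outlink by usage, with +1 smoothing so a
            # neglected link still passes a little rank (a faint
            # heartbeat) rather than none at all
            weights = {dst: usage.get((src, dst), 0) + 1 for dst in outs}
            total = sum(weights.values())
            for dst, w in weights.items():
                new[dst] += d * rank[src] * w / total
        rank = new
    return rank

links = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
usage = {("a", "b"): 9, ("a", "c"): 1}   # users overwhelmingly click a->b
r = usage_weighted_pagerank(links, usage)
print(r["b"] > usage_weighted_pagerank(links, {})["b"])  # usage lifts b
```

With empty usage data the weights are uniform and the computation reduces to the classic structure-only PageRank.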

    Truss Structure Optimization Based on an Improved Wolf Pack Algorithm

    Aiming at the optimization of truss structures, a wolf pack algorithm based on chaos and an improved search strategy is proposed. A mathematical model of truss optimization was constructed, classical truss structures were optimized, and the results were compared with those of other optimization algorithms. When selecting and updating the initial positions of the wolves, a chaos scheme is used to distribute the initial values evenly in the solution space; a phase factor is introduced to improve the wolf detection formula; and information interaction between wolves is increased, reducing the number of runs. The numerical results show that the improved wolf pack algorithm has fewer parameters, is simple to program and easy to implement, converges quickly, and can rapidly find the optimal solution. It is suitable for the optimal design of the section sizes of space truss structures.
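The chaos-based initialization of the wolves' positions can be sketched with a logistic map; the specific map, its parameter, and the bounds below are illustrative assumptions, not necessarily the paper's choices:

```python
def chaotic_population(n_wolves, dim, bounds, x0=0.7, r=4.0):
    """Initialize a wolf-pack population with a logistic map instead of
    a plain RNG (illustrative sketch of chaos-based initialization).

    The fully chaotic logistic map x <- r*x*(1-x) with r = 4 produces a
    deterministic, non-repeating sequence in (0, 1) that covers the
    interval densely; scaling it into the search bounds spreads the
    initial wolves evenly over the solution space.
    """
    lo, hi = bounds
    x = x0
    population = []
    for _ in range(n_wolves):
        wolf = []
        for _ in range(dim):
            x = r * x * (1.0 - x)             # logistic map iteration
            wolf.append(lo + (hi - lo) * x)   # scale into [lo, hi]
        population.append(wolf)
    return population

pack = chaotic_population(n_wolves=5, dim=3, bounds=(-10.0, 10.0))
print(all(-10.0 <= v <= 10.0 for w in pack for v in w))  # True
```

In a truss sizing problem, each coordinate of a wolf would be one member's cross-sectional area, clipped to the allowed section range.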

    Classic gully spatial identification and slope stability modeling using high-resolution elevation data and a data mining technique

    It is widely known that soil erosion is an issue of concern for soil and water quality, affecting agriculture and natural resources. Scientific efforts must therefore take high-resolution elevation datasets into consideration in order to implement a precision conservation approach effectively. New advances such as LiDAR products provide a basic source of information enabling researchers to identify small erosional landscape features. This study developed a methodology based on data mining of hydrologic and topographic attributes, associated with concentrated flow path identification, to distinguish classic gully side walls and bed areas. For a 0.91 km² region of the Keigley Branch-South Skunk River watershed, an area with gullies, we computed profile curvature, mean slope deviation, stream power index, and aspect, gridded at 1-m pixel resolution from the Iowa LiDAR project. The CLARA (Clustering LARge Applications) algorithm, an unsupervised clustering approach, was employed on 913,495 points, splitting the dataset into six groups, a number in agreement with the within-group sum of squared error (WSS) statistic. In addition, a new threshold criterion termed gully concentrated flow (GCF), based on the distributions of flow accumulation and mean overall slope, was introduced to produce polylines identifying the main hydrographic flow paths, corresponding to the gully beds. Cluster #6 was classified as gully side walls. After distinguishing gully and cliff areas among the points belonging to cluster 6, all six gullies were satisfactorily identified. The proposed methodology improves on existing techniques because it identifies distinct parts of gullies, including the side walls and the bed zone. Another important concept is assessing gully slope stability in order to generate useful information for precision conservation planning.
Although the limit-equilibrium concept has been used widely in rock mechanics, its application to precision conservation structures is relatively new. This study evaluated two multi-temporal surveys of a gullied area in western Iowa under the soil stability approach for precision conservation practice. The study computed the factor of safety (FS) over the gully area, including the headcut and gully side walls, using digital elevation models derived from surveys conducted in 1999 and 2014. The assessment revealed significantly less instability of the present-day slopes compared to the 1999 survey slopes. According to the sensitivity analysis, the internal friction angle (θ) had the largest effect on the slope stability factor (S.D.1999 = 0.18, S.D.2014 = 0.24), compared to variations in soil cohesion, failure plane angle, and slab thickness. In addition, critically unstable slopes within the gully, defined using units of the slope standard deviation as a threshold, covered areas of 61 m² and 396 m² for thresholds of one and two slope standard deviations, respectively. The majority of these critical areas were located near the headcut and on the border of the side walls. Based on the current literature, combining processed material (geotextile) with crop cover of high root density might be an alternative to improve slope stability, but empirical tests are necessary to validate this approach. Nevertheless, slope instability assessment must include other factors that capture the dynamics of failure.
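The factor of safety follows the limit-equilibrium idea; a common infinite-slope (dry) form is sketched below, where the formula variant and all soil parameters are illustrative assumptions rather than the study's measured values:

```python
import math

def factor_of_safety(cohesion, unit_weight, thickness,
                     slope_deg, friction_deg):
    """Infinite-slope limit-equilibrium factor of safety (dry case).

    FS = c / (gamma * z * sin(beta) * cos(beta)) + tan(phi) / tan(beta)

    c: soil cohesion (kPa); gamma: soil unit weight (kN/m^3);
    z: slab thickness (m); beta: failure plane angle; phi: internal
    friction angle. FS < 1 indicates a critically unstable slope.
    """
    beta = math.radians(slope_deg)
    phi = math.radians(friction_deg)
    cohesive_term = cohesion / (unit_weight * thickness
                                * math.sin(beta) * math.cos(beta))
    return cohesive_term + math.tan(phi) / math.tan(beta)

# Illustrative parameters, not the study's measured values
fs = factor_of_safety(cohesion=10.0, unit_weight=18.0, thickness=2.0,
                      slope_deg=35.0, friction_deg=30.0)
print(fs > 1.0)  # these gentle parameters give a stable slope
```

The sensitivity result above is consistent with this form: the friction term `tan(phi)/tan(beta)` scales the whole frictional contribution, so varying phi moves FS more than comparable variations in cohesion or slab thickness on steep slopes.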

    A Review of Deep Reinforcement Learning in Serverless Computing: Function Scheduling and Resource Auto-Scaling

    In the rapidly evolving field of serverless computing, efficient function scheduling and resource scaling are critical for optimizing performance and cost. This paper presents a comprehensive review of the application of Deep Reinforcement Learning (DRL) techniques in these areas. We begin by providing an overview of serverless computing, highlighting its benefits and challenges, with a particular focus on function scheduling and resource scaling. We then delve into the principles of DRL and its potential for addressing these challenges. A systematic review of recent studies applying DRL to serverless computing is presented, covering the algorithms, models, and reported performance. Our analysis reveals that DRL, with its ability to learn and adapt from an environment, shows promising results in improving the efficiency of function scheduling and resource scaling in serverless computing. However, several challenges remain, including the need for more realistic simulation environments, the handling of cold starts, and the trade-off between learning time and scheduling performance. We conclude by discussing potential future directions for this research area, emphasizing the need for more robust DRL models, better benchmarking methods, and the exploration of multi-agent reinforcement learning for more complex serverless architectures. This review serves as a valuable resource for researchers and practitioners aiming to understand and advance the application of DRL in serverless computing.
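To make the scheduling setting concrete, the sketch below uses tabular Q-learning as a minimal stand-in for the deep RL agents surveyed; the state encoding (per-worker queue lengths), action space, and reward signal are all illustrative assumptions, not a method from any reviewed paper:

```python
import random

def train_scheduler(episodes=500, n_workers=3, alpha=0.1, gamma=0.9,
                    eps=0.2, seed=0):
    """Toy RL function scheduler (tabular Q-learning stand-in for DRL).

    State: tuple of per-worker queue lengths (capped at 5).
    Action: which worker receives the next function invocation.
    Reward: negative queue length of the chosen worker, so the agent
    learns to balance load across workers.
    """
    rng = random.Random(seed)
    Q = {}
    for _ in range(episodes):
        queues = [0] * n_workers
        for _ in range(20):                       # 20 invocations/episode
            state = tuple(min(q, 5) for q in queues)
            acts = Q.setdefault(state, [0.0] * n_workers)
            a = (rng.randrange(n_workers) if rng.random() < eps
                 else max(range(n_workers), key=acts.__getitem__))
            reward = -queues[a]                   # penalize loaded workers
            queues[a] += 1
            if rng.random() < 0.5:                # some work completes
                w = rng.randrange(n_workers)
                queues[w] = max(0, queues[w] - 1)
            nxt = tuple(min(q, 5) for q in queues)
            next_value = max(Q.setdefault(nxt, [0.0] * n_workers))
            acts[a] += alpha * (reward + gamma * next_value - acts[a])
    return Q

Q = train_scheduler()
print(len(Q) > 0 and len(Q[(0, 0, 0)]) == 3)  # learned a policy table
```

A deep variant would replace the Q table with a neural network over a richer state (cold-start status, memory pressure, request rates), which is where the reviewed trade-off between learning time and scheduling performance arises.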

    Metaheuristics and Chaos Theory

    Chaos theory is a novel approach that has been widely used in various applications, one of the best known being its introduction into optimization. Chaotic systems are highly sensitive to initial conditions and exhibit both randomness and rich dynamical behaviour, properties that help accelerate the convergence of an optimization algorithm and enhance its diversity. In this work, we integrated 10 chaotic maps into several metaheuristic algorithms in order to extensively investigate the effectiveness of chaos theory for improving search capability. Extensive experiments were carried out, and the results show that chaotic optimization can be a very promising tool for solving optimization problems.
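The integration of a chaotic map into a metaheuristic can be illustrated with a chaotic local search around the current best solution; the logistic map, search radius, and test function below are illustrative assumptions, not the 10 maps or the specific algorithms of the paper:

```python
def chaotic_local_search(f, best, bounds, radius=0.5, steps=100,
                         x0=0.31, r=4.0):
    """Chaotic local search refining a metaheuristic's best solution
    (illustrative sketch; the map choice and radius are assumptions).

    A logistic-map sequence replaces uniform random numbers when
    probing points around the current best; the map's dense,
    non-repeating orbit helps the probe escape shallow local optima.
    """
    lo, hi = bounds
    x = x0
    best = list(best)
    for _ in range(steps):
        cand = []
        for v in best:
            x = r * x * (1.0 - x)                # logistic map in (0, 1)
            # map the chaotic value into a probe step of +/- radius
            cand.append(min(hi, max(lo, v + radius * (2.0 * x - 1.0))))
        if f(cand) < f(best):                    # greedy acceptance
            best = cand
    return best

sphere = lambda v: sum(t * t for t in v)
start = [0.8, -0.6]
refined = chaotic_local_search(sphere, start, bounds=(-5.0, 5.0))
print(sphere(refined) <= sphere(start))  # greedy step is never worse
```

In practice such a chaotic refinement is interleaved with the host metaheuristic's own update rules, which is how the 10 maps of the study were embedded.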