
    Deadlock avoidance with virtual channels

    High Performance Computing is a rapidly evolving area of computer science that aims to solve complicated computational problems by combining computational nodes connected through high-speed networks. This work concentrates on the network problems that appear in such systems and focuses especially on the deadlock problem, which can decrease the efficiency of communication or even destroy the balance of the network and paralyze it. The goal of this work is deadlock avoidance through the use of virtual channels in the switches of the network where the problem appears. Deadlock avoidance ensures that no data will be lost inside the network, at the cost of increased latency for the served packets, due to the extra computation that the switches must perform to apply the policy.
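    As a concrete illustration of how virtual channels break deadlock cycles, the sketch below shows dateline routing on a unidirectional ring, one classic virtual-channel scheme: packets start on virtual channel 0 and are promoted to channel 1 after crossing a designated "dateline" link, so the channel-dependency graph is acyclic. The node count, dateline position, and two-channel split are illustrative assumptions, not details from the thesis.

```python
# Minimal sketch of dateline-based virtual-channel (VC) assignment on a
# unidirectional ring. All parameters below are illustrative assumptions.

NUM_NODES = 8
DATELINE = 0          # node whose incoming link is the dateline


def next_hop(node: int) -> int:
    """Next node on the unidirectional ring."""
    return (node + 1) % NUM_NODES


def vc_for_hop(current: int, vc: int) -> int:
    """Virtual channel to use on the link leaving `current`.

    Packets travel on VC 0 until they cross the dateline link
    (NUM_NODES-1 -> 0); afterwards they use VC 1. Since no packet ever
    moves from VC 1 back to VC 0, the channel-dependency graph is
    acyclic and the ring cannot deadlock.
    """
    crosses_dateline = next_hop(current) == DATELINE
    return 1 if (vc == 1 or crosses_dateline) else vc


def route(src: int, dst: int) -> list[tuple[int, int]]:
    """List of (link source node, virtual channel) pairs for src -> dst."""
    path, node, vc = [], src, 0
    while node != dst:
        vc = vc_for_hop(node, vc)
        path.append((node, vc))
        node = next_hop(node)
    return path


if __name__ == "__main__":
    # A packet from node 6 to node 2 crosses the dateline and is
    # promoted to VC 1 for the remainder of its route.
    print(route(6, 2))   # [(6, 0), (7, 1), (0, 1), (1, 1)]
```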

    Development of High Performance Molecular Dynamics with Application to Multimillion-Atom Biomass Simulations

    An understanding of the recalcitrance of plant biomass is important for efficient, economical production of biofuel. Lignins are hydrophobic, branched polymers and form a residual barrier to effective hydrolysis of lignocellulosic biomass. Understanding lignin's structure, dynamics, and its interaction with and binding to cellulose will help in finding more efficient ways to reduce its contribution to recalcitrance. Molecular dynamics (MD) using the GROMACS software is employed to study these properties in atomic detail. Studying complex, realistic models of pretreated plant cell walls requires simulations significantly larger than were previously possible. The most challenging part of such large simulations is the computation of the electrostatic interaction. As a solution, the reaction-field (RF) method has been shown to give accurate results for lignocellulose systems, as well as good computational efficiency on leadership-class supercomputers. For molecules not accurately modeled by RF, the particle-mesh Ewald method has been improved by implementing 2D decomposition and thread-level parallelization. Other scaling-limiting computational components, such as load balancing and memory requirements, were identified and addressed to allow such large-scale simulations for the first time. This work was done with the help of modern software engineering practices, including code review, continuous integration, and integrated development environments, adapted to the special requirements of scientific codes. Multiple simulations of lignocellulose were performed. The principal simulation presented explains the temperature-dependent structure and dynamics of individual softwood lignin polymers in aqueous solution. With decreasing temperature, the lignins are found to transition from mobile, extended states to glassy, compact states. The low-temperature collapse is thermodynamically driven by the increase of the translational entropy and density fluctuations of water molecules removed from the hydration shell.
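    The reaction-field method mentioned above replaces the long-range part of the Coulomb sum with a dielectric-continuum correction, which is why it scales so well: each pair interaction stays short-ranged and cutoff-limited. The sketch below evaluates the pairwise RF potential in the form used by GROMACS; the cutoff and dielectric values are illustrative assumptions, not the dissertation's settings.

```python
import numpy as np

# Minimal sketch of the reaction-field (RF) pairwise electrostatic
# potential. Cutoff, dielectric constants, and charges are illustrative
# assumptions, not parameters taken from the dissertation.

ONE_4PI_EPS0 = 138.935458   # kJ mol^-1 nm e^-2, GROMACS-style units


def rf_potential(qi, qj, r, r_cut=1.2, eps_rf=78.0, eps_r=1.0):
    """Reaction-field Coulomb energy between charges qi and qj at distance r.

    Beyond the cutoff the medium is treated as a dielectric continuum,
    which adds a quadratic correction k_rf * r^2 and a shift c_rf so the
    potential goes smoothly to zero at r = r_cut.
    """
    k_rf = (eps_rf - eps_r) / (2.0 * eps_rf + eps_r) / r_cut**3
    c_rf = 1.0 / r_cut + k_rf * r_cut**2
    r = np.asarray(r, dtype=float)
    v = ONE_4PI_EPS0 * qi * qj / eps_r * (1.0 / r + k_rf * r**2 - c_rf)
    return np.where(r < r_cut, v, 0.0)


# Energy of a +1/-1 charge pair at a few separations (nm); note the
# value approaches zero as r approaches the cutoff.
print(rf_potential(1.0, -1.0, [0.3, 0.6, 1.19]))
```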

    Full Issue: Volume 14, Number 1, Fall 2020

    Complete PDF file of Volume 14, Number 1, of The Science Journal of the Lander College of Arts and Sciences. Published Fall 2020.

    Machine Learning-Enabled Resource Allocation for Underlay Cognitive Radio Networks

    Due to the rapid growth of new wireless communication services and applications, much attention has been directed to frequency spectrum resources and the way they are regulated. Considering that the radio spectrum is a limited natural resource, supporting the ever-increasing demand for higher capacity and higher data rates for diverse sets of users, services, and applications is a challenging task, requiring innovative technologies capable of providing new ways of efficiently exploiting the available radio spectrum. Consequently, dynamic spectrum access (DSA) has been proposed as a replacement for static spectrum allocation policies. DSA is implemented in three modes: interweave, overlay, and underlay [1]. The key enabling technology for DSA is cognitive radio (CR), which is among the most prominent technologies for the next generation of wireless communication systems. Unlike a conventional radio, which is restricted to operating only in designated spectrum bands, a CR can operate in different spectrum bands owing to its ability to sense and understand its wireless environment, learn from past experience, and proactively change its transmission parameters as needed. These features are provided by an intelligent software package called the cognitive engine (CE). In general, the CE manages radio resources to accomplish cognitive functionalities, allocating and adapting the radio resources to optimize the performance of the network. The cognitive functionality of the CE can be achieved by leveraging machine learning techniques. This thesis therefore explores the application of two machine learning techniques in enabling the cognition capability of the CE: neural network-based supervised learning and reinforcement learning. Specifically, it develops resource allocation algorithms that leverage machine learning to solve the resource allocation problem for heterogeneous underlay cognitive radio networks (CRNs). The proposed algorithms are evaluated under extensive simulation runs. The first resource allocation algorithm uses a neural network-based learning paradigm to provide a fully autonomous and distributed underlay DSA scheme in which each CR operates based on predicting the effect of its transmission on a primary network (PN). The scheme is based on a CE with an artificial neural network that predicts the adaptive modulation and coding configuration for the primary link nearest to a transmitting CR, without exchanging information between the primary and secondary networks. By managing the effect of the secondary network (SN) on the primary network, the presented technique keeps the relative average throughput change in the primary network within a prescribed maximum value, while also finding transmit settings for the CRs that yield throughput as large as the primary network's interference limit allows. The second resource allocation algorithm uses reinforcement learning and aims at distributively maximizing the average quality of experience (QoE) across the transmissions of CRs with different types of traffic while satisfying a primary network interference constraint. To best satisfy the QoE requirements of delay-sensitive traffic, a cross-layer resource allocation algorithm is derived, and its performance is compared against a physical-layer algorithm in terms of meeting end-to-end traffic delay constraints.
    Moreover, to accelerate the learning performance of the presented algorithms, the idea of transfer learning is integrated. The philosophy behind transfer learning is to allow well-established, expert cognitive agents (i.e., base stations or mobile stations in the context of wireless communications) to teach newly activated, naive agents. The exchange of learned information is used to improve the learning performance of a distributed CR network, and the thesis further identifies best practices for transferring knowledge between CRs so as to reduce the communication overhead. The investigations in this thesis propose a novel technique that accurately predicts the modulation scheme and channel coding rate used on a primary link without any exchange of information between the two networks (e.g., access to feedback channels), while succeeding in the main goal of determining the transmit power of the CRs such that the interference they create remains below the maximum threshold that the primary network can sustain with minimal effect on its average throughput. They also provide physical-layer as well as cross-layer machine learning-based algorithms that address the challenge of resource allocation in underlay cognitive radio networks, resulting in better learning performance and reduced communication overhead.
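    To make the reinforcement-learning idea concrete, the following sketch shows a CR learning which transmit power to use in each coarsely quantized channel state, earning a throughput-like reward only while the interference it causes at the primary network stays under a cap. The state/action spaces, reward shape, and channel model are illustrative assumptions, not the thesis's actual formulation.

```python
import random
from collections import defaultdict

# Toy sketch of interference-constrained power selection by a cognitive
# radio. All quantities below are illustrative assumptions.

POWER_LEVELS = [0.1, 0.2, 0.5, 1.0]   # candidate transmit powers (W)
INTERFERENCE_CAP = 0.3                # max tolerable interference at the PN
ALPHA, EPSILON = 0.1, 0.1             # learning rate, exploration rate

q_table = defaultdict(float)          # (state, action) -> estimated value


def reward(power, gain_to_pn):
    """Throughput proxy, penalized when the PN constraint is violated."""
    if power * gain_to_pn > INTERFERENCE_CAP:
        return -1.0
    return power


def choose_action(state):
    if random.random() < EPSILON:     # explore occasionally
        return random.randrange(len(POWER_LEVELS))
    return max(range(len(POWER_LEVELS)), key=lambda a: q_table[(state, a)])


def train(episodes=5000):
    for _ in range(episodes):
        state = random.randrange(3)   # quantized channel state
        gain = 0.1 + 0.2 * state      # toy mapping: state -> gain to PN
        action = choose_action(state)
        r = reward(POWER_LEVELS[action], gain)
        # Single-step episodes, so the update moves toward the reward.
        q_table[(state, action)] += ALPHA * (r - q_table[(state, action)])


train()
for s in range(3):
    best = max(range(len(POWER_LEVELS)), key=lambda a: q_table[(s, a)])
    print(f"state {s}: learned transmit power {POWER_LEVELS[best]} W")
```

    In states with a strong channel to the primary network, the learned policy backs off to a lower power, which is the behavior the interference constraint is meant to induce.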

    Democratic Exclusion: The Right to Vote in the United States, United Kingdom, and France

    This research focuses on the forms of exclusion that democratizing processes have historically facilitated. The dynamics of democratization often lead political coalitions to change electoral rules so as simultaneously to extend and constrict the right to vote across different categories of persons, as well as to reinforce existing exclusions. This pattern occurred in all the 'exemplary models' of early democratization, and yet the historical narratives relied on by the comparative democratization literature neglect its exclusionary dimension, and thereby misinform comparative theory building. The dissertation empirically documents the dark side of democratization in the three paradigmatic cases of the United States, the United Kingdom, and France, and develops and tests a theory explaining cross-national and cross-time variation. At key moments in a country's development, political entrepreneurs advance ideas of community belonging for the purpose of securing a governing coalition. When successful, these ideas of political community are embedded in new institutions and in public opinion, shaping the expectations of political agents across the political spectrum and raising the costs of coalition-building and political mobilization across categories of people. The exclusions were thereby made resilient to subsequent democratizing processes. The dissertation advances research on the role of ideas in social science by focusing on the micro-foundations of democratic exclusion. The model predicts various forms of political behavior that are integrally important to democratization, and is tested against debates, voting behavior, and correspondence in and outside of parliaments, legislatures, and constitutional conventions. The data draw on archival fieldwork, multiple datasets of legislator behavior, constituency demographics, and institutional change. These allow for the identification of stable patterns as well as change across time, and supplement a process-tracing research design.

    Necessary Condition Analysis (NCA): Logic and Methodology of “Necessary but Not Sufficient” Causality

    Theoretical “necessary but not sufficient” statements are common in the organizational sciences. Traditional data analysis approaches (e.g., correlation or multiple regression) are not appropriate for testing or inducing such statements. This article proposes necessary condition analysis (NCA) as a general and straightforward methodology for identifying necessary conditions in data sets. The article presents the logic and methodology of necessary-but-not-sufficient contributions of organizational determinants (e.g., events, characteristics, resources, efforts) to a desired outcome (e.g., good performance). A necessary determinant must be present for achieving an outcome, but its presence is not sufficient to obtain that outcome; without the necessary condition there is guaranteed failure, which cannot be compensated by other determinants of the outcome. This logic and its related methodology are fundamentally different from the traditional sufficiency-based logic and methodology. Practical recommendations and free software are offered to support researchers.
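    The core of NCA can be illustrated with a short sketch: rather than fitting a line through the middle of the data, one draws a ceiling line on top of it and measures the empty space above, which quantifies how strongly the determinant X constrains the outcome Y. The step-function (CE-FDH) ceiling used below is one of the approaches implemented in the authors' free software; the toy data and this particular effect-size computation are illustrative assumptions.

```python
import numpy as np

# Toy sketch of NCA's ceiling-envelope idea. Data are synthetic:
# X caps Y (an empty upper-left corner), the signature of necessity.

rng = np.random.default_rng(0)
x = rng.uniform(0, 10, 200)
y = np.minimum(rng.uniform(0, 10, 200), 2 + 0.8 * x)


def ce_fdh_effect_size(x, y):
    """Effect size d = empty ceiling-zone area / scope, via a CE-FDH hull."""
    order = np.argsort(x)
    xs, ys = x[order], y[order]
    ceiling = np.maximum.accumulate(ys)          # running max = step ceiling
    scope = (xs.max() - xs.min()) * (ys.max() - ys.min())
    # Area between the global Y maximum and the step ceiling (the zone
    # where no observations occur), as a left Riemann sum.
    widths = np.diff(xs)
    empty = np.sum((ys.max() - ceiling[:-1]) * widths)
    return empty / scope


# d near 0 means X does not constrain Y; clearly positive values
# indicate a necessary-condition effect.
print(f"NCA effect size d = {ce_fdh_effect_size(x, y):.2f}")
```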

    Solutions for large scale, efficient, and secure Internet of Things

    The design of a general architecture for the Internet of Things (IoT) is a complex task, due to the heterogeneity of the devices, communication technologies, and applications that make up such systems. There are therefore significant opportunities to improve the state of the art, whether by improving the performance of the system or by solving actual issues in current systems. This thesis focuses on three aspects of the IoT in particular. First, issues of cyber-physical systems are analysed. In these systems, IoT technologies are widely used to monitor, control, and act on physical entities, and some of the most important issues concern the communication layer, which must offer high reliability, low latency, and high energy efficiency. Several solutions for the channel access scheme of such systems are proposed, each tailored to a different specific scenario. These solutions, which exploit the capabilities of state-of-the-art radio transceivers, prove effective in improving the performance of the considered systems. Positioning services for cyber-physical systems are also investigated, in order to improve their accuracy. Next, the focus moves to network and service optimization for traffic-intensive applications such as video streaming. This type of traffic is common among non-constrained devices, like smartphones and augmented/virtual reality headsets, which form an integral part of the IoT ecosystem. The proposed solutions are able to increase video Quality of Experience while wasting less bandwidth than state-of-the-art strategies. Finally, the security of IoT systems is investigated. While often overlooked, this aspect is fundamental to enabling the ubiquitous deployment of the IoT. Security issues of commonly used IoT protocols are therefore presented, together with a proposal for an authentication mechanism based on physical channel features. This authentication strategy proved effective as a standalone mechanism and as an additional layer that improves the security of legacy systems.
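    As an illustration of the channel-feature authentication idea in the last paragraph, the following toy example enrolls a fingerprint of a device's channel (here, per-subcarrier gains) and accepts later frames only if the new observation stays close to it; the feature choice, noise model, and threshold are assumptions for illustration, not the thesis's actual mechanism.

```python
import numpy as np

# Toy sketch of authentication from physical channel features. A device
# at a different location sees a different channel, so its observations
# fall far from the enrolled fingerprint. All values are illustrative.

rng = np.random.default_rng(1)
N_SUBCARRIERS = 32
THRESHOLD = 0.2          # max normalized distance for acceptance


def observe(channel: np.ndarray, noise: float = 0.05) -> np.ndarray:
    """One noisy measurement of a device's per-subcarrier channel gains."""
    return channel + rng.normal(0, noise, channel.shape)


def authenticate(fingerprint: np.ndarray, observation: np.ndarray) -> bool:
    """Accept if the observation is close to the enrolled fingerprint."""
    dist = np.linalg.norm(observation - fingerprint)
    return dist / np.linalg.norm(fingerprint) < THRESHOLD


legit_channel = rng.uniform(0.5, 1.5, N_SUBCARRIERS)     # legitimate device
attacker_channel = rng.uniform(0.5, 1.5, N_SUBCARRIERS)  # different location

fingerprint = observe(legit_channel)                     # enrollment phase
print("legit accepted:   ", authenticate(fingerprint, observe(legit_channel)))
print("attacker accepted:", authenticate(fingerprint, observe(attacker_channel)))
```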

    The Eastern Question and the fallacy of modernity on the premodern origins of the modern inter-state order in southeastern Europe

    The ‘eastern question’ of the 19th century is conventionally understood as the power vacuum created by the decay of the geostrategically important Ottoman Empire in the context of a highly competitive and expansionary European inter-state system. Conventional approaches to International Relations argue that the eastern question was solved by creating multiple legitimate, sovereign national states in lieu of Ottoman rule, as the outcome of an expanding European modernity replacing the outdated, illegitimate, and despotic rule of Oriental princes. However, this assumption entails a tension between the supposedly universal scope of European modernity and its fractured, multi-national form of transmission. This contradiction, implicit in International Relations theory, is the subject of this thesis. Examining this problem in the light of the eastern question, the thesis offers a historical-sociological reconstruction of the social transformations that produced the supposedly ‘modern’ geopolitical ‘order’ in Southeastern Europe. The critical re-reading and positive reconstruction of the Ottoman trajectory, from the end of territorial expansion in 1683 to the Greek secession in 1821, problematizes the extent to which the territorial fragmentation of political rule can be understood as the ‘logical’ result of the expansion of ‘modern’ social and political relations. It is argued that, instead of understanding these developments as a teleological and predetermined process of Westernization, the key to understanding the emergence of the post-Ottoman state system lies in deciphering the dialectic between a ‘domestic’ social struggle among pre-capitalist classes and an intensifying pan-European geopolitical dynamic. Hence, rather than understanding the process of nation-formation as the inevitable result of the expansion of ‘modern’ international relations, it is necessary to emphasize the specificity of the Ottoman transformation, like that of any other. This in turn helps to illuminate the unnatural and malleable nature of ‘modern’ territorial inter-national ‘orders’. Rather than implementing a just, natural, or final domestic and geopolitical order, ‘national’ fragmentations result from specific, materially conditioned social struggles. This raises generic problems for static and ahistorical understandings of social and geopolitical relations. It is suggested that a theoretically open, historical materialist sociology of International Relations can provide a remedy. In consequence, it is argued that the ‘eastern question’, far from being solved by the formation of national states, remains open to this day.