2,711 research outputs found

    Configuration Management of Distributed Systems over Unreliable and Hostile Networks

    Economic incentives of large criminal profits and the threat of legal consequences have pushed criminals to continuously improve their malware, especially command and control channels. This thesis applied concepts from successful malware command and control to explore the survivability and resilience of benign configuration management systems. This work expands on existing stage models of the malware life cycle to contribute a new model for identifying malware concepts applicable to benign configuration management. The Hidden Master architecture is a contribution to master-agent network communication. In the Hidden Master architecture, communication between master and agent is asynchronous and can operate through intermediate nodes. This protects the master secret key, which gives full control of all computers participating in configuration management. Multiple improvements to idempotent configuration were proposed, including the definition of a minimal base resource dependency model, simplified resource revalidation, and the use of an imperative general-purpose language for defining idempotent configuration. Following the constructive research approach, the improvements to configuration management were designed into two prototypes. This allowed validation in laboratory testing, in two case studies, and in expert interviews. In laboratory testing, the Hidden Master prototype was more resilient than leading configuration management tools in high-load and low-memory conditions, and against packet loss and corruption. Only the research prototype was adaptable to a network without stable topology, due to the asynchronous nature of the Hidden Master architecture. The main case study used the research prototype in a complex environment to deploy a multi-room, authenticated audiovisual system for a client of the organization deploying the configuration. The case studies indicated that an imperative general-purpose language can be used for idempotent configuration in real life, for defining new configurations in unexpected situations using the base resources and abstracting those using standard language features, and that such a system seems easy to learn. Potential business benefits were identified and evaluated using individual semi-structured expert interviews. Respondents agreed that the models and the Hidden Master architecture could reduce costs and risks, improve developer productivity, and allow faster time-to-market. Protection of master secret keys and the reduced need for incident response were seen as key drivers for improved security. Low-cost geographic scaling and leveraging the file-serving capabilities of commodity servers were seen to improve scaling and resiliency. Respondents identified jurisdictional legal limitations on encryption and requirements for cloud operator auditing as factors potentially limiting the full use of some concepts.
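
    The thesis couples idempotent base resources with an imperative general-purpose language. As a rough illustration of that idempotency pattern only (the resource names and functions below are hypothetical, not the prototype's actual API), a minimal Python sketch:

```python
# Hypothetical sketch of idempotent base resources in an imperative
# general-purpose language; not the thesis prototype's actual API.
import os

def ensure_directory(path: str) -> bool:
    """Create the directory only if it does not already exist.
    Returns True when a change was made."""
    if os.path.isdir(path):
        return False          # desired state already holds; do nothing
    os.makedirs(path)
    return True

def ensure_file_content(path: str, content: str) -> bool:
    """Write the file only when its current content differs (revalidation)."""
    if os.path.exists(path):
        with open(path, "r", encoding="utf-8") as f:
            if f.read() == content:
                return False  # no change needed
    with open(path, "w", encoding="utf-8") as f:
        f.write(content)
    return True

if __name__ == "__main__":
    # Running the same configuration twice performs work only on the first run.
    changed = [
        ensure_directory("/tmp/demo-app"),
        ensure_file_content("/tmp/demo-app/app.conf", "port = 8080\n"),
    ]
    print("changes applied:", sum(changed))
```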

    Review of Path Selection Algorithms with Link Quality and Critical Switch Aware for Heterogeneous Traffic in SDN

    Software Defined Networking (SDN) introduced network management flexibility that eludes traditional network architecture. Nevertheless, the pervasive demand for various cloud computing services with different levels of Quality of Service requirements in our contemporary world has made network service provisioning challenging. One of these challenges is path selection (PS) for routing heterogeneous traffic with end-to-end quality of service support specific to each traffic class. This challenge has drawn the research community's attention to the extent that many PSAs have been proposed. However, a gap still exists that calls for further study. This paper reviews the existing PSAs and the Baseline Shortest Path Algorithms (BSPA) upon which many relevant PSAs are built to help identify these gaps. The paper categorizes the PSAs into four groups based on their path selection criteria: (1) PSAs that use static or dynamic link quality to guide PSD, (2) PSAs that consider the criticality of a switch in terms of update operations, FlowTable limitations or port capacity to guide PSD, (3) PSAs that consider flow variabilities to guide PSD, and (4) PSAs that use ML optimization in their PSD. We then review and compare the techniques' design in each category against the identified SDN PSA design objectives, solution approaches, BSPA, and validation approaches. Finally, the paper recommends directions for further research.
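
    As a minimal illustration of a baseline shortest path algorithm guided by link quality (a generic sketch, not any specific PSA from the review; the topology and composite metric below are made up), consider:

```python
# Minimal sketch: Dijkstra over an SDN topology where each link weight is a
# composite link-quality metric (delay penalised by loss). Values are made up.
import heapq

def link_cost(delay_ms: float, loss: float) -> float:
    # Hypothetical composite metric: loss adds a large penalty on top of delay.
    return delay_ms * (1.0 + 10.0 * loss)

def dijkstra(graph, src, dst):
    """graph: {node: {neighbour: cost}}; returns (total_cost, path)."""
    queue = [(0.0, src, [src])]
    seen = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == dst:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for nxt, w in graph.get(node, {}).items():
            if nxt not in seen:
                heapq.heappush(queue, (cost + w, nxt, path + [nxt]))
    return float("inf"), []

topology = {
    "s1": {"s2": link_cost(2, 0.00), "s3": link_cost(1, 0.20)},
    "s2": {"s4": link_cost(2, 0.01)},
    "s3": {"s4": link_cost(1, 0.30)},
}
print(dijkstra(topology, "s1", "s4"))  # lossy s1-s3-s4 path loses to s1-s2-s4
```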

    REMOVING THE MASK: VIDEO FINGERPRINTING ATTACKS OVER TOR

    The Onion Router (Tor) is used by adversaries and warfighters alike to encrypt session information and gain anonymity on the internet. Since its creation in 2002, Tor has gained popularity with terrorist organizations, human traffickers, and illegal drug distributors who wish to use Tor services to mask their identity while engaging in illegal activities. Fingerprinting attacks assist in thwarting these attempts. Website fingerprinting (WF) attacks have been proven successful at linking a user to the website they have viewed over an encrypted Tor connection. With consumer video streaming making up a large majority of internet traffic and sites like YouTube remaining among the top visited sites in the world, it is just as likely that adversaries are using videos to spread misinformation, illegal content, and terrorist propaganda. Video fingerprinting (VF) attacks aim to use encrypted network traffic to predict the content of encrypted video sessions in closed- and open-world scenarios. This research builds upon an existing dataset of encrypted video session data and uses statistical analysis to train a machine-learning classifier, using deep fingerprinting (DF), to predict videos viewed over Tor. DF is a machine learning technique that relies on convolutional neural networks (CNN) and can be used to conduct VF attacks against Tor. By analyzing the results of these experiments, we can more accurately identify malicious video streaming activity over Tor.
    Approved for public release. Distribution is unlimited.
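
    Deep fingerprinting classifies traces of packet directions with a convolutional neural network. The following is a minimal sketch of that idea only, assuming a simplified architecture and random toy traces rather than the thesis' dataset or exact model:

```python
# Minimal sketch (not the thesis' exact model): a small 1D CNN over
# fixed-length sequences of packet directions (+1 outbound, -1 inbound),
# the input representation commonly used by deep fingerprinting (DF).
import torch
import torch.nn as nn

class TinyFingerprintCNN(nn.Module):
    def __init__(self, seq_len: int = 5000, n_classes: int = 10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(1, 32, kernel_size=8, padding=4), nn.ReLU(),
            nn.MaxPool1d(4),
            nn.Conv1d(32, 64, kernel_size=8, padding=4), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),          # global pooling over time
        )
        self.classifier = nn.Linear(64, n_classes)

    def forward(self, x):                     # x: (batch, 1, seq_len)
        return self.classifier(self.features(x).squeeze(-1))

# Toy usage with random "traffic traces"; real traces would come from
# captured Tor sessions labelled by the video watched.
model = TinyFingerprintCNN()
traces = torch.randint(0, 2, (8, 1, 5000)).float() * 2 - 1  # +/-1 directions
logits = model(traces)
print(logits.shape)  # torch.Size([8, 10])
```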

    LKP: Simulation of a Load Balancing Design Using the NTH Method

    Load balancing is a technique for distributing traffic load over two or more connection paths in a balanced way, so that traffic runs optimally, throughput is maximized, response time is minimized, and overload on any single connection path is avoided. Load balancing is used when a server has more users than its maximum capacity. Load balancing also distributes the workload evenly across two or more computers, network links, CPUs, hard drives, or other resources, in order to achieve optimal resource utilization. MEDIA DATA NUSANTARA is a company focused on being an Internet Service Provider, handling installations such as Wi-Fi. ISPs such as MEDIA DATA NUSANTARA typically use BGP to control and manage traffic from different sources in a multi-homed network (connected to more than one ISP). BGP is highly scalable and has a very wide reach for serving network users. Because of the large number of network users, load balancing is needed for every user, and Media Data Nusantara usually uses the NTH method to configure load balancing for users. This internship project used two ISPs and configured them for load balancing. The mechanism is that the MikroTik router marks packets that want to access the internet, chooses which ISP path they will take, and balances the load across the two ISPs. A failover technique is also applied to this network: if one gateway connection goes down, the other gateway automatically carries all the network traffic, so the internet connection is not completely lost. This is done so that internet users in every division of MEDIA DATA NUSANTARA can work optimally. To address the problems above, the author therefore carried out this study under the title "Simulation of a Load Balancing Design Using the NTH Method".
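
    On MikroTik routers the NTH method is typically realised with mangle rules that mark every Nth new connection for a different gateway. The sketch below illustrates only the underlying round-robin-with-failover idea in Python; the gateway names and reachability states are hypothetical, not the report's actual router configuration:

```python
# Hypothetical sketch of NTH-style load balancing with failover: every second
# new connection goes via ISP1, the others via ISP2, and a dead gateway is
# skipped. Gateway names and reachability are made up.
from itertools import count

GATEWAYS = ["ISP1", "ISP2"]
alive = {"ISP1": True, "ISP2": True}    # normally updated by ping checks

def choose_gateway(conn_no: int) -> str:
    """NTH rule: connection number modulo the gateway count picks the path;
    failover falls back to any gateway that is still reachable."""
    preferred = GATEWAYS[conn_no % len(GATEWAYS)]
    if alive[preferred]:
        return preferred
    for gw in GATEWAYS:                 # failover path
        if alive[gw]:
            return gw
    raise RuntimeError("no gateway reachable")

connections = count()
print([choose_gateway(next(connections)) for _ in range(4)])  # alternates ISP1/ISP2
alive["ISP2"] = False                   # simulate a dead link
print([choose_gateway(next(connections)) for _ in range(4)])  # all ISP1
```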

    Implementation of Transmission Line Fault Detection System using Long Range Wireless Sensor Networks

    This paper proposes a fault detection system designed for transmission lines using a Long-Range Wireless Sensor Network (LoRaWSN). The system is designed to detect and locate faults across transmission lines in real time, which can significantly improve the reliability and efficiency of power transmission systems. A WSN is built across transmission lines over an area. The faults identified by these sensor nodes are then transmitted to a central control unit, which analyzes and displays the data. The LoRaWAN technology enables the WSN to cover long distances while consuming minimal power, making it ideal for monitoring transmission lines. The proposed fault detection system is evaluated through real-world experiments, which demonstrate the feasibility and effectiveness of the proposed system. Overall, this paper presents a novel and practical approach for fault detection on transmission lines, which has the potential to improve the reliability and efficiency of power transmission systems.
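
    A plausible node-side view of such a system is threshold-based fault detection followed by a small uplink payload to the central control unit. The sketch below is illustrative only; the thresholds, node identifier, payload format and send() stand-in are assumptions, not the paper's implementation:

```python
# Hypothetical sketch of node-side logic: simple threshold checks on line
# measurements and a compact payload for the LoRaWAN uplink.
import json, time
from typing import Optional

NODE_ID = "tower-017"
OVERCURRENT_A = 400.0      # assumed trip threshold
UNDERVOLTAGE_KV = 100.0    # assumed threshold for a broken conductor

def detect_fault(current_a: float, voltage_kv: float) -> Optional[str]:
    if current_a > OVERCURRENT_A:
        return "overcurrent"
    if voltage_kv < UNDERVOLTAGE_KV:
        return "undervoltage"
    return None

def build_uplink(fault: str, current_a: float, voltage_kv: float) -> bytes:
    # Small JSON payload; a real deployment would likely use a tighter binary
    # format to respect LoRaWAN payload limits.
    return json.dumps({
        "node": NODE_ID, "fault": fault, "t": int(time.time()),
        "i_a": round(current_a, 1), "v_kv": round(voltage_kv, 1),
    }).encode()

def send(payload: bytes) -> None:       # stand-in for the radio driver
    print("uplink:", payload)

fault = detect_fault(current_a=452.3, voltage_kv=128.0)
if fault:
    send(build_uplink(fault, 452.3, 128.0))
```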

    Design and implementation of layer 2 and layer 3 telecommunications network laboratories using a simulator, for the laboratory of the Escuela de Ingeniería Mecánica Eléctrica, Facultad de Ingeniería, Universidad de San Carlos de Guatemala

    To present the design and implementation of telecommunications laboratories using a simulator called PNETLab to provide network equipment virtually, to develop the design and implementation of the PNETLab simulator for enabling virtual telecommunications equipment, and to provide references to the supporting material for the development of the project.

    Cybersecurity: Past, Present and Future

    The digital transformation has created a new digital space known as cyberspace. This new cyberspace has improved the workings of businesses, organizations, governments, society as a whole, and the day-to-day life of an individual. With these improvements come new challenges, and one of the main challenges is security. The security of the new cyberspace is called cybersecurity. Cyberspace has created new technologies and environments such as cloud computing, smart devices, IoT, and several others. To keep pace with these advancements in cyber technologies, there is a need to expand research and develop new cybersecurity methods and tools to secure these domains and environments. This book is an effort to introduce the reader to the field of cybersecurity, highlight current issues and challenges, and provide future directions to mitigate or resolve them. The main specializations of cybersecurity covered in this book are software security, hardware security, the evolution of malware, biometrics, cyber intelligence, and cyber forensics. We must learn from the past, evolve our present and improve the future. Based on this objective, the book covers the past, present, and future of these main specializations of cybersecurity. The book also examines the upcoming areas of research in cyber intelligence, such as hybrid augmented and explainable artificial intelligence (AI). Human and AI collaboration can significantly increase the performance of a cybersecurity system. Interpreting and explaining machine learning models, i.e., explainable AI, is an emerging field of study and has a lot of potential to improve the role of AI in cybersecurity.
    Comment: Author's copy of the book published under ISBN: 978-620-4-74421-

    The Dark Web and Human Trafficking

    This is a quantitative-comparative analysis that focuses on Artificial Intelligence (AI) platforms that assist law enforcement agencies as they combat human trafficking. Human trafficking is a Transnational Organized Crime (TOC), which means it can impact every country in the world and, in doing so, every person in the world. AI uses machine-learning capabilities to identify clusters, odd and/or unusual fonts, words, numbers, and other markers in advertisements that promote the sale of human beings. Human trafficking affects males, females, and children of all ages and can include different types of trafficking, such as sex and labor trafficking. By using these AI platforms, law enforcement officers are able to identify and help more human beings than ever before, and in a quicker timeframe. This quantitative-comparative analysis compared Spotlight, Traffick Jam, Traffick Cam, and Domain Insight Graph (DIG) to determine whether these platforms were helping law enforcement. The study revolved around questions of accuracy, consistency, and effectiveness for each platform and found that the majority of the AI platforms led the way to better, more efficient platforms from the same companies, which learned how changes could further assist law enforcement in the future. While each platform assisted in its own way, there were deltas in each area that lead to the need for future research on AI and how it can be used to help victims of human trafficking and convict more human traffickers in later years.

    3D Visualisation - An Application and Assessment for Computer Network Traffic Analysis

    The intent of this research is to develop and assess the application of 3D data visualisation to the field of computer security. The growth of available data relating to computer networks necessitates a more efficient and effective way of presenting information to analysts in support of decision making and situational awareness. Advances in computer hardware and display software have made more complex and interactive presentation of data in 3D possible. While many attempts at creating data-rich 3D displays have been made in the field of computer security, they have not become the tool of choice in the industry. There is also a limited amount of published research on the assessment of these tools in comparison to 2D graphical and tabular approaches to displaying the same data. This research was conducted through the creation of a novel abstraction framework for visualisation of computer network data, the Visual Interactive Network Analysis Framework (VINAF). This framework was implemented in software, and the software prototype was assessed using both a procedural approach applied to a published forensics challenge and a human-participant-based experiment. The key contributions to the fields of computer security and data visualisation made by this research include the creation of a novel abstraction framework for computer network traffic which features several new visualisation approaches. An implementation of this software was developed for the specific cybersecurity-related task of computer network traffic analysis and published under an open source license to the cybersecurity community. The research contributes a novel approach to human-based experimentation developed during the COVID-19 pandemic and also implements a novel procedure-based testing approach to the assessment of the prototype data visualisation tool. Results of the research showed, through procedural experimentation, that the abstraction framework is effective for network forensics tasks and exhibited several advantages when compared to alternative approaches. The user participation experiment indicated that most of the participants deemed the abstraction framework to be effective in several tasks related to computer network traffic analysis. There was not a strong indication that it would be preferred over the existing approaches utilised by the participants; however, it would likely be used to augment existing methods.
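
    As a generic illustration of 3D visualisation of network traffic (not VINAF itself; the flow records and chosen axes are made up), network flow attributes can be mapped to a 3D scatter plot:

```python
# Generic illustration (not the VINAF tool): mapping three attributes of
# network flows (source port, destination port, packet count) to a 3D scatter.
import matplotlib.pyplot as plt

flows = [  # (src_port, dst_port, packets, label) -- made-up records
    (51234, 443, 120, "https"),
    (51300, 53, 4, "dns"),
    (44321, 22, 900, "ssh"),
    (50000, 445, 3000, "smb scan?"),
]

fig = plt.figure()
ax = fig.add_subplot(projection="3d")
xs, ys, zs = zip(*[(f[0], f[1], f[2]) for f in flows])
ax.scatter(xs, ys, zs)
for x, y, z, label in flows:
    ax.text(x, y, z, label, fontsize=8)   # annotate each flow
ax.set_xlabel("source port")
ax.set_ylabel("destination port")
ax.set_zlabel("packet count")
plt.show()
```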

    Design of an E-learning system using semantic information and cloud computing technologies

    Humanity is currently suffering from many difficult problems that threaten the life and survival of the human race. It is very easy for all mankind to be affected, directly or indirectly, by these problems. Education is a key solution for most of them. In this thesis we tried to make use of current technologies to enhance and ease the learning process. We have designed an e-learning system based on semantic information and cloud computing, in addition to many other technologies that contribute to improving the educational process and raising the level of students. The design was built after much research on useful technology, its types, and examples of actual systems that were previously discussed by other researchers. In addition to the proposed design, an algorithm was implemented to identify topics found in large textual educational resources. It was tested and proved to be efficient against other methods. The algorithm has the ability to extract the main topics from textual learning resources, link related resources, and generate interactive dynamic knowledge graphs. It accomplishes these tasks accurately and efficiently, even for larger books. We used Wikipedia Miner, TextRank, and Gensim within our algorithm. Our algorithm's accuracy was evaluated against Gensim and largely improved upon it. Augmenting the system design with the implemented algorithm will produce many useful services for improving the learning process, such as: identifying the main topics of big textual learning resources automatically and connecting them to other well-defined concepts from Wikipedia, enriching current learning resources with semantic information from external sources, providing students with browsable dynamic interactive knowledge graphs, and making use of learning groups to encourage students to share their learning experiences and feedback with other learners.
    Doctoral Programme in Telematic Engineering, Universidad Carlos III de Madrid. Committee: President: Luis Sánchez Fernández; Secretary: Luis de la Fuente Valentín; Member: Norberto Fernández Garcí
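
    TextRank-style keyword extraction, one of the techniques the thesis builds on, ranks words in a co-occurrence graph. The sketch below shows that general idea only, with an illustrative sample text, stopword list and window size; it is not the thesis' algorithm:

```python
# Minimal TextRank-style keyword sketch: build a co-occurrence graph over
# words within a sliding window and rank nodes with PageRank.
import re
import networkx as nx

text = ("Cloud computing provides scalable computing resources. "
        "E-learning systems can use cloud resources and semantic information "
        "to link learning resources into knowledge graphs.")

STOPWORDS = {"and", "can", "into", "to", "use", "the", "a", "e"}
words = [w for w in re.findall(r"[a-z]+", text.lower()) if w not in STOPWORDS]

graph = nx.Graph()
WINDOW = 3
for i, w in enumerate(words):
    for other in words[i + 1 : i + WINDOW]:   # co-occurrence within the window
        if other != w:
            graph.add_edge(w, other)

scores = nx.pagerank(graph)
top = sorted(scores, key=scores.get, reverse=True)[:5]
print(top)  # e.g. ['resources', 'computing', 'cloud', ...]
```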