
    Low weight congestion control for multi sender applications

    This paper presents a prototype for single-rate reliable multicast congestion control, which has been built into an existing commercial whiteboard. The prototype was developed using a novel scheme engineered around conflicting, industry-provided requirements for collaborative workspaces: the scheme had to be low-weight when used with many senders, yet compatible with NATs, firewalls, and reflectors. The key to overcoming this conflict was to combine congestion control and recovery feedback. This differs from many current solutions, which are often designed for use with a wide variety of protocols and thus operate independently of the recovery mechanism. This paper does not go into the detail required to specify a protocol; instead it discusses a few important design requirements for multi-sender applications, which are generally not considered by current research, and describes an approach towards meeting them. Document type: Part of book or chapter of book.
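
    The combined-feedback idea can be illustrated with a short sketch. The message layout below is a hypothetical illustration (the paper does not specify a protocol); it only shows how a single receiver report can carry both the recovery request (a NACK bitmap) and the congestion signal (loss rate and receive rate) for one of many senders, keeping per-sender feedback low-weight.

    import struct

    # Hypothetical combined feedback report: one message serves both recovery
    # (NACK bitmap of missing packets) and congestion control (loss/receive rate).
    def pack_feedback(sender_id: int, base_seq: int, nack_bitmap: int,
                      loss_rate: float, recv_rate_kbps: int) -> bytes:
        # base_seq is the first sequence number covered by the 32-bit bitmap
        return struct.pack("!IIIfI", sender_id, base_seq, nack_bitmap,
                           loss_rate, recv_rate_kbps)

    def unpack_feedback(data: bytes) -> dict:
        sender_id, base_seq, bitmap, loss, rate = struct.unpack("!IIIfI", data)
        missing = [base_seq + i for i in range(32) if bitmap & (1 << i)]
        return {"sender": sender_id, "missing": missing,
                "loss_rate": loss, "recv_rate_kbps": rate}

    # Example: a receiver missed packets 101 and 103 from sender 7,
    # observed 2% loss, and is receiving at 512 kbit/s.
    report = pack_feedback(7, 100, 0b1010, 0.02, 512)
    print(unpack_feedback(report))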

    TransCom: a virtual disk-based cloud computing platform for heterogeneous services

    This paper presents the design, implementation, and evaluation of TransCom, a virtual disk (Vdisk) based cloud computing platform that supports heterogeneous services of operating systems (OSes) and their applications in enterprise environments. In TransCom, clients store all data and software, including the OS and application software, on Vdisks that correspond to disk images located on centralized servers, while computing tasks are carried out by the clients. Users can boot any client into the desired OS, including Windows, and access software and data services from Vdisks as usual, without having to handle tasks such as installation, maintenance, and management. By centralizing storage yet distributing computing tasks, TransCom can greatly reduce potential system maintenance and management costs. We have implemented a multi-platform TransCom prototype that supports both Windows and Linux services. Extensive evaluation based on both test-bed and real-usage experiments has demonstrated that TransCom is a feasible, scalable, and efficient solution for real-world use. © 2004-2012 IEEE
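
    A minimal sketch may help picture the storage/compute split. The block protocol and server below are purely illustrative assumptions, not TransCom's actual Vdisk implementation: the client runs its computation locally but fetches fixed-size disk blocks from a central image server.

    import socket
    import struct

    BLOCK_SIZE = 4096  # hypothetical fixed block size

    def serve_vdisk(image_path: str, host: str = "0.0.0.0", port: int = 9000):
        # Serve read-only block requests for one disk image (illustration only).
        with socket.socket() as srv, open(image_path, "rb") as img:
            srv.bind((host, port))
            srv.listen()
            conn, _ = srv.accept()
            with conn:
                while True:
                    hdr = conn.recv(8)              # request = 64-bit block number
                    if len(hdr) < 8:
                        break
                    (block_no,) = struct.unpack("!Q", hdr)
                    img.seek(block_no * BLOCK_SIZE)
                    conn.sendall(img.read(BLOCK_SIZE).ljust(BLOCK_SIZE, b"\0"))

    def read_block(sock: socket.socket, block_no: int) -> bytes:
        # Client side: computation stays local, storage stays on the server.
        sock.sendall(struct.pack("!Q", block_no))
        data = b""
        while len(data) < BLOCK_SIZE:
            chunk = sock.recv(BLOCK_SIZE - len(data))
            if not chunk:
                break
            data += chunk
        return data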

    INTRODUCING TRANSPARENT WEB CACHING IN A LOCAL AREA NETWORK

    The term 'transparent web caching' refers to cache technology in which web traffic is automatically intercepted and redirected toward one or more cache servers. The redirection of web traffic can be accomplished using L4 switches or routers. Because it is completely transparent to the user (no browser configuration is required), the service is easy to deploy and turns out to be scalable and fail-safe. This work presents the results of our experimental use of transparent caching technology in a simple network environment. We focus on the impact the technique can have on network performance, with its benefits and problems, as well as its effects on end users. The analysis is based on data gathered in an operational network over a two-month period.
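
    The performance analysis described here comes down to two headline metrics for a transparent cache: hit ratio and byte savings. The sketch below computes both from hypothetical access-log records (an assumption; the study itself used logs gathered from an operational network over two months).

    def cache_metrics(records):
        # records: iterable of (was_hit, object_size_bytes) tuples, a
        # hypothetical stand-in for parsed cache access-log entries.
        records = list(records)
        hits = sum(1 for hit, _ in records if hit)
        saved_bytes = sum(size for hit, size in records if hit)
        total_bytes = sum(size for _, size in records)
        return {
            "hit_ratio": hits / len(records) if records else 0.0,
            "byte_savings": saved_bytes / total_bytes if total_bytes else 0.0,
        }

    sample = [(True, 12_000), (False, 48_000), (True, 8_000), (False, 2_000)]
    print(cache_metrics(sample))   # hit_ratio 0.5, byte_savings ~0.29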

    The Need for Revisions to the Law of Wiretapping and Interception of Email

    I argue that a person's privacy interest in his email is the same as his privacy interest in a telephone conversation. Moreover, the privacy interest in email remains unchanged regardless of whether it is intercepted in transmission or covertly accessed from the recipient's mailbox. If one accepts this assumption, it follows that the level of protection against surveillance by law enforcement officers should be the same[...] As technology continues to blur the distinction between wire and electronic communication, it becomes apparent that a new methodology must be developed in order to provide logical and consistent protection to private communications. The statutes must be revised so as to protect the privacy of communications while also providing a means by which law enforcement officers can obtain judicial approval to eavesdrop when necessary. Otherwise, increasing integration between data and voice communications will render the current statutory scheme arbitrary and impractical. By way of background, this article will discuss the law governing mail searches as well as the law of covert searches generally. This article will go on to discuss the regulation of pen registers, and will then trace the evolution of the relevant federal statutory and constitutional protections afforded to telephone conversations. Next, this article will discuss the statutory protections and the emerging case law addressing the privacy of email and other communication via computer. Particular emphasis will be placed on several recent federal court decisions that illustrate the problems arising from the current statutory scheme. Lastly, this article will discuss the controversial implementation of the FBI's Carnivore software for the purpose of surreptitiously intercepting email, and the recent deployment of a keystroke-logging device as another means of learning the contents of private electronic communications. This article asserts that the Fourth Amendment protections applicable to telephone conversations set out by Katz v. United States and Berger v. New York (subsequently codified and expanded by the Federal Wiretap Act) should be implemented more broadly to encompass the surreptitious surveillance of postal mail, email, and other promising forms of electronic communication. This article argues in favor of more uniform regulation of covert surveillance of private communications regardless of the choice of technology employed to convey the message.

    Remote support technology for small business

    Small businesses need a more efficient solution for managing their Information Technology support needs. Because small businesses require custom solutions, IT service providers must dedicate highly skilled personnel to client sites, incurring high overhead costs and restricting their ability to apply their employee base to multiple clients. This restriction in cost and flexibility places a high cost burden on small business clients, straining an already limited budget. The use of remote IT support technology may provide the basis for a solution to these problems. By applying remote technology, an IT provider could centralize its workforce, managing clients from a single location rather than dedicating manpower to client sites. If the technology were available to support such a model, this change in methodology could result in a more manageable solution. Small businesses had the highest propensity to outsource IT support for the management of their hardware, software, web hosting, server/host management, networking, and security requirements. Many remote tools currently exist to support these needs, offering solutions for access, alerts, system monitoring, diagnosis, and reporting of a client's IT infrastructure. Using these tools, a remote solution showed the greatest ability to manage the software, server/host management, and networking needs of small business organizations. Web hosting service requirements were strongly supported as well, although the use of remote solutions would change the current overall structure of web hosting support, making the solution more difficult to implement. In the areas of hardware and security, although many of the primary support needs were strongly addressed, flaws were discovered that made the methodology less than ideal. The primary flaws of remote support were the inability to manage hardware device failure, the inability to manage the network medium, and security issues arising from the possibility of separating a system administrator from the designated system through denial-of-service-type attacks. Although each of these flaws represents a significant issue for a remote IT management solution, the risk of each could be limited through redundancy, offering a feasible workaround. From both a business and a technological perspective, remote solutions proved to be a viable alternative to on-site support for the management of small business IT needs. The total cost of remote solutions is comparable to the average yearly salary of an IT employee, typically offering the same potential for supporting a client's IT infrastructure as a one-time investment. In addition, remote solutions offer significant savings to the provider through reduced administrative overhead and an increased potential for business expansion, allowing significant cost savings to be passed on to the client. Although remote technology does not offer a perfect solution for supporting small business, the functionality that is readily available has strong potential to increase the efficiency of current small business IT support methods and offer more cost-effective solutions to small business organizations.

    The PEARL digital electronics lab : full access to the workbench via the web

    Web-based course management and delivery software is becoming common in many areas of education, but the facilities provided by such systems do not support practical laboratory work. In many technical areas this limitation constitutes a serious restriction on the usefulness of any web-based educational framework, since "pen-and-paper" courseware is but a small part of the overall pedagogic material that must be provided to students. The system described in this paper addresses this need in the area of digital electronics and is being developed within the scope of the IST project PEARL.

    Packet analysis for network forensics: A comprehensive survey

    Packet analysis is a primary traceback technique in network forensics which, provided the captured packet data is sufficiently detailed, can play back even the entire network traffic for a particular point in time. It can be used to find traces of nefarious online behavior, data breaches, unauthorized website access, malware infection, and intrusion attempts, and to reconstruct image files, documents, email attachments, and other content sent over the network. This paper is a comprehensive survey of the use of packet analysis, including deep packet inspection, in network forensics, and provides a review of AI-powered packet analysis methods with advanced network traffic classification and pattern identification capabilities. Since not all network information can be used in court, the types of digital evidence that might be admissible are detailed. The properties of both hardware appliances and packet analyzer software are reviewed from the perspective of their potential use in network forensics.
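
    As a concrete example of the workflow surveyed here, the sketch below uses the scapy packet-analysis library (an assumption; any pcap-capable analyzer would do) to pull a few basic forensic features out of a capture file whose name is hypothetical.

    from collections import Counter
    from scapy.all import rdpcap, IP, TCP   # pip install scapy

    def summarize_capture(pcap_path: str) -> dict:
        # Load the capture and extract simple starting points for an
        # investigation: who talked to whom, and how much port-80 traffic.
        packets = rdpcap(pcap_path)
        talkers = Counter()
        web_packets = 0
        for pkt in packets:
            if IP in pkt:
                talkers[(pkt[IP].src, pkt[IP].dst)] += 1
            if TCP in pkt and 80 in (pkt[TCP].sport, pkt[TCP].dport):
                web_packets += 1
        return {
            "total_packets": len(packets),
            "top_talkers": talkers.most_common(5),
            "port_80_packets": web_packets,
        }

    if __name__ == "__main__":
        print(summarize_capture("evidence.pcap"))   # hypothetical capture file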

    Bits and Bytes: The Carnivore Initiative and the Search and Seizure of Electronic Mail

    This Note examines the application of Fourth Amendment search and seizure doctrines to the interception of electronic mail within the context of the FBI Carnivore initiative. The author argues that the traditional law of electronic surveillance's understanding of communication is outdated and never contemplated new technologies like Carnivore and their far-reaching implications. Consequently, the author argues that, to protect our long-understood expectations of privacy, the search and seizure of electronic documents should be analyzed under the traditional "papers" analysis. To do so, the Supreme Court would afford the interception of electronic documents the highest form of constitutional protection available under law.

    A real-time demonstrative analysis of lightweight payload encryption in resource-constrained devices based on MQTT

    Constrained devices are limited in resources, namely memory (ROM and RAM), CPU, and battery life (if available). They are often used as sensors that collect data, as machine-to-machine (M2M) nodes, or as smart devices that control services and electrical appliances. When such devices are connected to a network they form "things" and, as a whole, part of the Internet of Things (IoT). Message Queue Telemetry Transport (MQTT) is a lightweight, open, simple, client-server publish/subscribe messaging transport protocol that is useful and efficient for most resource-constrained IoT devices and supports three Quality of Service (QoS) levels for reliable communication. It is an essential protocol for communication in constrained environments such as Device-to-Device (D2D) and Internet of Things (IoT) contexts. The MQTT protocol lacks concrete security mechanisms apart from Transport Layer Security (TLS) based on Secure Socket Layer (SSL) certificates. However, TLS is not the lightest of security protocols and increases network overhead, especially for constrained devices. About 70% of ordinary IoT devices also lack data encryption, especially at the client end, which could otherwise serve as an alternative to TLS. In this thesis, an experimental setup is designed to demonstrate the effect of the MQTT protocol on the network performance of a constrained device for different Quality of Service (QoS) levels and variable payload sizes. The novel part of this study covers client-side encryption of payloads and its effect on network performance. In the experiments, lightweight 128-bit Advanced Encryption Standard (AES) encryption is applied to the data. Messages are transferred using the three different QoS levels in MQTT between a real wired low-end publisher client and a low-end subscriber client via a broker server, for different payload sizes. Packets are captured to analyze end-to-end latency, throughput, and message loss, along with measurement of encryption and decryption processing time. According to the experimental results, the non-encrypted (plaintext) payload has a lower network load and hence produces relatively better network performance over MQTT, in terms of percentage loss and message delivery, than the encrypted payload.
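
    A minimal sketch of the client-side step evaluated here, assuming the paho-mqtt and pycryptodome packages and a hypothetical broker address: the payload is encrypted with AES-128 before publishing, so confidentiality does not rely on TLS. The thesis specifies 128-bit AES; the CBC mode, topic name, and key handling shown below are illustrative assumptions.

    import os
    from Crypto.Cipher import AES                 # pip install pycryptodome
    from Crypto.Util.Padding import pad, unpad
    import paho.mqtt.publish as publish           # pip install paho-mqtt

    KEY = os.urandom(16)   # 128-bit key; in practice pre-shared with the subscriber

    def encrypt_payload(plaintext: bytes, key: bytes) -> bytes:
        iv = os.urandom(16)                       # fresh IV per message
        cipher = AES.new(key, AES.MODE_CBC, iv)
        return iv + cipher.encrypt(pad(plaintext, AES.block_size))

    def decrypt_payload(blob: bytes, key: bytes) -> bytes:
        iv, ciphertext = blob[:16], blob[16:]
        cipher = AES.new(key, AES.MODE_CBC, iv)
        return unpad(cipher.decrypt(ciphertext), AES.block_size)

    # Publish one encrypted reading at QoS 1 -- the kind of message the
    # experiments compare against a plaintext baseline at each QoS level.
    payload = encrypt_payload(b'{"temp": 22.4}', KEY)
    publish.single("sensors/room1", payload, qos=1,
                   hostname="broker.example.local")   # hypothetical broker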

    Privacy and Security in the Cloud: Some Realism About Technical Solutions to Transnational Surveillance in the Post-Snowden Era

    Since June 2013, the leak of thousands of classified documents regarding highly sensitive U.S. surveillance activities by former National Security Agency (NSA) contractor Edward Snowden has greatly intensified discussions of privacy, trust, and freedom in relation to the use of global computing and communication services. This is happening during a period of ongoing transition to cloud computing services by organizations, businesses, and individuals. There has always been a question inherent in this transition: are cloud services sufficiently able to guarantee the security of their customers' data as well as the proper restrictions on access by third parties, including governments? While worries over government access to data in the cloud are a predominant part of the ongoing debate over the use of cloud services, the Snowden revelations highlight that intelligence agency operations pose a unique threat to the ability of services to keep their customers' data out of the hands of domestic as well as foreign governments. The search for a proper response is ongoing, from the perspective of market players, governments, and civil society. At the technical and organizational level, industry players are responding with the wider and more sophisticated deployment of encryption as well as a new emphasis on the use of privacy-enhancing technologies and innovative architectures for securing their services. These responses are the focus of this Article, which contributes to the discussion of transnational surveillance by looking at the interaction between the relevant legal frameworks on the one hand, and the possible technical and organizational responses of cloud service providers to such surveillance on the other. While the Article's aim is to contribute to the debate about government surveillance with respect to cloud services in particular, much of the discussion is relevant for Internet services more broadly.