7 research outputs found

    Security analysis of a WLAN network sample in Tunja, Boyacá, Colombia

    This paper presents the results of a security analysis of WLAN networks in the city of Tunja, Boyacá, Colombia, based on a random sample distributed across the city. The study is a result of the research project "Diagnosis of technology security, applied to a sample of organizations in the city of Tunja", funded by the University of Santo Tomas, Sectional Tunja. The information collected and analyzed was obtained through warchalking and wardriving techniques, applied to a representative sample of wireless networks of public and private organizations, educational institutions and households located in different parts of the city. The research demonstrated different levels of risk associated with particular device configurations in the public, private and residential sectors; finally, conclusions and recommendations were made to raise the level of security through good practice in the configuration and use of these networks.
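    To make the classification step of such a survey concrete, here is a minimal sketch of how wardriving scan records might be binned into risk levels by encryption mode; the field names, example networks and risk tiers are illustrative assumptions, not the study's actual data or methodology.

```python
# Minimal sketch: classifying wardriving scan records by security level.
# The record fields ("ssid", "encryption") and the risk tiers are
# illustrative assumptions, not the survey's actual methodology.
from collections import Counter

# Illustrative records such as a wardriving tool might export.
scans = [
    {"ssid": "CafeNet", "encryption": "OPEN"},
    {"ssid": "Oficina1", "encryption": "WEP"},
    {"ssid": "HomeAP", "encryption": "WPA2"},
]

RISK = {"OPEN": "high", "WEP": "high", "WPA": "medium", "WPA2": "low", "WPA3": "low"}

def risk_level(record):
    """Map a network's encryption mode to a coarse risk tier."""
    return RISK.get(record["encryption"], "unknown")

print(Counter(risk_level(s) for s in scans))  # e.g. Counter({'high': 2, 'low': 1})
```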

    Improved Wireless Security through Physical Layer Protocol Manipulation and Radio Frequency Fingerprinting

    Wireless networks are particularly vulnerable to spoofing and route poisoning attacks due to the contested transmission medium. Traditional bit-layer defenses, including encryption keys and MAC address control lists, are vulnerable to extraction and identity spoofing, respectively. This dissertation explores three novel strategies that leverage the wireless physical layer to improve security in low-rate wireless personal area networks. The first, physical layer protocol manipulation, identifies the true transceiver design of remote devices through analysis of replies to packets transmitted with modified physical layer headers. Results herein demonstrate a methodology that correctly differentiates among six IEEE 802.15.4 transceiver classes with greater than 99% accuracy, regardless of claimed bit-layer identity. The second strategy, radio frequency fingerprinting, accurately identifies the true source of every wireless transmission in a network, even among devices of the same design and manufacturer. Results suggest that even low-cost signal collection receivers can achieve greater than 90% authentication accuracy within a defense system based on radio frequency fingerprinting. The third strategy, based on received signal strength quantification, can be leveraged to rapidly locate suspicious transmission sources and to perform physical security audits of critical networks. Results herein reduce the mean absolute percentage error of a widely-utilized distance estimation model by 20% by examining signal strength measurements from real-world networks in a military hospital and a civilian hospital.
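    Distance estimation from received signal strength is commonly done with a log-distance path-loss model; whether that is the "widely-utilized" model the abstract refers to is an assumption. A minimal sketch with illustrative constants, including the error metric the abstract reports:

```python
# Minimal sketch of RSS-based distance estimation via the log-distance
# path-loss model. Whether this is the "widely-utilized" model the abstract
# refers to is an assumption; the constants below are illustrative only.
def estimate_distance(rss_dbm, rss_at_d0=-40.0, d0=1.0, path_loss_exp=2.7):
    """Invert the log-distance model: RSS(d) = RSS(d0) - 10*n*log10(d/d0)."""
    return d0 * 10 ** ((rss_at_d0 - rss_dbm) / (10 * path_loss_exp))

def mape(true_dists, est_dists):
    """Mean absolute percentage error, the metric the abstract reports."""
    return 100 * sum(abs(t - e) / t for t, e in zip(true_dists, est_dists)) / len(true_dists)

print(round(estimate_distance(-67.0), 1))  # ~10 m under these assumed constants

# Synthetic check of the error metric against three assumed ground-truth ranges.
true_d = [5.0, 10.0, 20.0]
est_d = [estimate_distance(r) for r in (-59.0, -67.0, -75.0)]
print(round(mape(true_d, est_d), 1))
```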

    Understanding mobile network quality and infrastructure with user-side measurements

    Measurement collection is a primary step towards analyzing and optimizing the performance of a telecommunication service. With a Mobile Broadband (MBB) network, the measurement process has not only to track the network's Quality of Service (QoS) features but also to assess a user's perspective on its service performance. The latter requirement leads to "user-side measurements", which assist in discovering the performance issues that make a user of a service unsatisfied and eventually switch to another network. User-side measurements also serve as a first-hand survey of the problem domain. In this thesis, we exhibit the potential of measurements collected at the network edge by considering two well-known approaches, namely crowdsourced and distributed testbed-based measurements. The primary focus is on exploiting crowdsourced measurements while dealing with the challenges associated with them. These challenges include differences in sampling densities across a region, skewed and non-uniform measurement layouts, inaccuracy in sampling locations, differences in RSS readings due to device diversity, and other non-ideal sampling characteristics. In the presence of these heterogeneous characteristics of user-side measurements, we propose how to accurately detect mobile coverage holes, how to devise a sample selection process that generates a reliable radio map at reduced sampling cost, and how to identify cellular infrastructure in places where the information is not public. Finally, the thesis unveils the potential of a distributed measurement testbed in retrieving performance features from domains including the user's context, service content and network features, and in understanding the impact of these features on the MBB service at the application layer. Taking web browsing as a case study, it further presents an objective web-browsing Quality of Experience (QoE) model.
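    As a rough illustration of building a radio map from crowdsourced samples and flagging candidate coverage holes (the thesis's actual methods are not given in the abstract), the sketch below bins geotagged RSS measurements into a grid and flags weak cells; the grid size and threshold are assumed values:

```python
# Illustrative sketch only: grid-binning crowdsourced RSS samples into a radio
# map and flagging weak cells as candidate coverage holes. The 0.01-degree grid
# and the -110 dBm threshold are assumptions, not the thesis's parameters.
from collections import defaultdict
from statistics import median

def build_radio_map(samples, cell_deg=0.01):
    """samples: iterable of (lat, lon, rss_dbm) tuples from user devices."""
    cells = defaultdict(list)
    for lat, lon, rss in samples:
        key = (round(lat / cell_deg), round(lon / cell_deg))
        cells[key].append(rss)
    # Median per cell is robust to device-diversity outliers in RSS readings.
    return {key: median(vals) for key, vals in cells.items()}

def coverage_holes(radio_map, threshold_dbm=-110.0):
    """Cells whose median RSS falls below the threshold."""
    return [cell for cell, rss in radio_map.items() if rss < threshold_dbm]

samples = [(60.17, 24.94, -95.0), (60.17, 24.94, -98.0), (60.18, 24.95, -115.0)]
print(coverage_holes(build_radio_map(samples)))  # flags the -115 dBm cell
```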

    Probabilistic models for mobile phone trajectory estimation

    Thesis (Ph.D.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 2011. Cataloged from PDF version of thesis. Includes bibliographical references (p. 157-161).
    This dissertation is concerned with the problem of determining the track or trajectory of a mobile device - for example, a sequence of road segments on an outdoor map, or a sequence of rooms visited inside a building - in an energy-efficient and accurate manner. GPS, the dominant positioning technology today, has two major limitations. First, it consumes significant power on mobile phones, making it impractical for continuous monitoring. Second, it does not work indoors. This dissertation develops two ways to address these limitations: (a) subsampling GPS to save energy, and (b) using alternatives to GPS such as WiFi localization, cellular localization, and inertial sensing (with the accelerometer and gyroscope) that consume less energy and work indoors. The key challenge is to match a sequence of infrequent (from subsampling) and inaccurate (from WiFi, cellular or inertial sensing) position samples to an accurate output trajectory. This dissertation presents three systems, all using probabilistic models, to accomplish this matching. The first, VTrack, uses Hidden Markov Models to match noisy or sparsely sampled geographic (lat, lon) coordinates to a sequence of road segments on a map. We evaluate VTrack on 800 drive hours of GPS and WiFi localization data collected from 25 taxicabs in Boston. We find that VTrack tolerates significant noise and outages in location estimates, and saves energy, while providing accurate enough trajectories for applications like travel-time aware route planning. CTrack improves on VTrack with a Markov Model that uses "soft" information in the form of raw WiFi or cellular signal strengths, rather than geographic coordinates. It also uses movement and turn "hints" from the accelerometer and compass to improve accuracy. We implement CTrack on Android phones, and evaluate it on cellular signal data from over 126 hours (1,074 miles) of driving. CTrack can retrieve over 75% of a user's drive accurately on average, even from highly inaccurate (175 metres raw position error) GSM data. iTrack uses a particle filter to combine inertial sensing data from the accelerometer and gyroscope with WiFi signals and accurately track a mobile phone indoors. iTrack has been implemented on the iPhone, and can track a user to within less than a metre when walking with the phone in the hand or pants pocket, over 5x more accurately than existing WiFi localization approaches. iTrack also requires very little manual effort for training, unlike existing localization systems that require a user to visit hundreds or thousands of locations in a building and mark them on a map.
    by Arvind Thiagarajan. Ph.D.
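    VTrack's matching step is a hidden Markov model decode: road segments are the hidden states, noisy position samples are the observations, and Viterbi recovers the most likely segment sequence. The following is a generic Viterbi map-matching sketch under an assumed Gaussian emission noise and a simple adjacency-based transition model, not VTrack's actual implementation:

```python
# Generic HMM map-matching sketch in the spirit of VTrack (not its actual
# code): hidden states are road segments, observations are noisy position
# samples, and Viterbi recovers the most likely segment sequence.
import math

def viterbi_match(observations, segments, seg_pos, adjacency, sigma=50.0):
    """observations: noisy (x, y) samples; seg_pos: segment -> representative
    point; adjacency: segment -> set of segments reachable from it (include
    the segment itself to allow staying on it)."""
    def emission(obs, seg):
        # Log-likelihood under assumed Gaussian position noise.
        d = math.dist(obs, seg_pos[seg])
        return -(d * d) / (2 * sigma * sigma)

    # Each trellis cell holds (best log-score, best path) ending in a segment.
    trellis = [{s: (emission(observations[0], s), [s]) for s in segments}]
    for obs in observations[1:]:
        prev, cur = trellis[-1], {}
        for s in segments:
            # Best predecessor among segments that can transition into s.
            cands = [(prev[p][0] + emission(obs, s), prev[p][1] + [s])
                     for p in segments if s in adjacency[p]]
            cur[s] = max(cands)
        trellis.append(cur)
    return max(trellis[-1].values())[1]

# Toy example: two segments 100 m apart; the middle sample is ambiguous but
# still closer to A, so the decoded path is A, A, B.
segments = ["A", "B"]
seg_pos = {"A": (0.0, 0.0), "B": (100.0, 0.0)}
adjacency = {"A": {"A", "B"}, "B": {"A", "B"}}
print(viterbi_match([(5.0, 2.0), (40.0, -3.0), (95.0, 1.0)], segments, seg_pos, adjacency))
```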

    Reading cyberspace : fictions, figures and (dis)embodiment

    My thesis tracks the human body in cyberspace as a popular cultural construct, from its origins in cyberpunk fiction in the 1980s to the pervasion of cyberspatial narratives in contemporary fictions, along with its representations within wider cultural texts, such as film, the mainstream media, and on the Internet. Across the two respective sections of the thesis, I focus upon six recurring literal-metaphorical characters, entities or motifs which serve as points of collision, entanglement and reiteration for a wide variety of discourses. These figures—the avatar, the hacker, the nanotechnological swarm, the fursona, the caring computer, and the decaying digital—have varying cultural functions in their respective representations of the human/technological interface. Informed by theorists such as Donna Haraway (1991, 2008), N. Katherine Hayles (2001) and others, I trace both their origins and their shifting and (often increasingly prolific) representations from the 1980s to the present. This allows me to uncover these figures’ registering of contextual discourses, and permits, in turn, an interrogation of the extent of their normative character, along with measuring how and to what extent, if any, these figures may offer alternative visions of human (and other) subjectivity. It also permits a rethinking of “cyberspace” itself. Section One analyses three figures that depict the human/technological interface as a space for reinscribing and reifying Cartesian dualistic views of human subjectivity, along with the exclusive and marginalising implications of the remapping of that dualism. The figures in Section One—the avatar, the hacker, and the nanotechnological swarm—have their roots in the 1980s and have stratified over time, becoming commonly deployed in descriptions of the human/technological interface. These figures function in first evoking and then managing the threats to the unified masculine subject posed by the altering human/machine relationship, policing rather than collapsing the subjective boundaries between them. They maintain and reiterate their attendant logics of identity, recapitulating an image of technology as the object of human invention, and never a contributor to the substantiation of the human subject. Science fiction—especially cyberpunk—has at least partially set the terms for understanding present-day relationships between humans and technologies, and those terms are relentlessly humanistic and teleological, despite their putatively postmodern and fragmentary aesthetic. The threat of the technological other is almost invariably feminine-coded, and my work in this section is explicated particularly in the light of Haraway’s work and feminist theories of embodiment, including the work of Elizabeth Grosz (1994) and Margrit Shildrick (1997, 2002). Section Two analyses three emerging figures—ones not so clearly and widely defined in fiction and popular culture—that depict the human/technological interface as fundamentally co-substantiating, rather than the latter being the product of the former. Acting as nodes of connection and constitution for various phenomena both depicted in fiction and enacted/performed at the human/technological interface itself, these three figures—the fursona, the caring computer, and the decaying digital—demonstrate potential ways to understand the human/technological interface outside of conventional, dualistic discourses of transcendental disembodiment of a bounded subject-self.
Deploying theoretical work on concepts such as Alison Landsberg’s notion of prosthetic memory (2004) and Brian Massumi’s reading of the “real-material-but-incorporeal” body (2002), as well as Haraway’s later work on companion species (2008), I position these figures as representative visions of technologically-mediated subjectivity that allow us to imagine our relationships with technology as co-operative, open and materially co-substantiating. I argue that they recover the potential to rupture the unified and dualistic mind-subject that is both represented and contained by the figures seen in Section One, while reflecting a more recognisably prosaic, ongoing transformation of subjective participants in human/technological encounters. In opening up these two respective clusters of human-technological figures, I map two attendant visions of cyberspace. The first is the most common: the smooth, Euclidean grid into which the discrete unified consciousness is projected away from the body, which is conflated with (a reductive understanding of) virtuality, and to which access is allowed or denied based on highly conventional lines of gender, race, sexuality and so on. The second vision is emerging: it is possible to view cyberspace as less of a “space” at all, and more of a technologically-mediated field of material implication—one which is not discrete from the putatively offline world, which is implicit in the subject formation of its users and participants, and accounts for, rather than disavowing, the physical, bodily substrate from which it is explicated.
    EThOS - Electronic Theses Online Service. Arts and Humanities Research Council, Research Fellowship: Newcastle University, School of English. United Kingdom.

    Guidelines and infrastructure for the design and implementation of highly adaptive, context-aware, mobile, peer-to-peer systems

    Through a thorough review of existing literature and an extensive study of two large ubicomp systems, problems are identified with current mobile design practices and infrastructures, along with a lack of required software. From these problems, a set of guidelines for the design of mobile, peer-to-peer, context-aware systems is derived. Four key items of software infrastructure that are desirable but currently unavailable for mobile systems are identified. Each of these items is subsequently implemented, and the thesis describes each one together with at least one system in which it was used and trialled. The four items of mobile software infrastructure are:
    - An 802.11 wireless driver that can automatically switch between ad hoc and infrastructure networks when appropriate, combined with a peer discovery mechanism that identifies peers and the services running and available on them.
    - A hybrid positioning system that combines GPS, 802.11 and GSM positioning techniques to deliver location information that is almost constantly available, and that can collect further 802.11 and GSM node samples during normal use of the system (see the sketch after this list).
    - A distributed recommendation system that, in addition to providing standard recommendations, can determine the relevance of data stored on the mobile device. The same system uses this information to prioritise data when exchanging information with peers, and to determine which data may be culled when the system is low on storage space without greatly affecting overall system performance.
    - An infrastructure for creating highly adaptive, context-aware mobile applications. The Domino infrastructure allows software functionality to be recommended, exchanged between peers, installed, and executed at runtime.
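    As an illustration of the hybrid positioning idea (a sketch under assumed accuracy figures, not the thesis's implementation), the following code simply prefers whichever of the GPS, 802.11 or GSM fixes is currently available with the lowest expected error:

```python
# Minimal sketch of hybrid positioning: prefer whichever source currently has
# a fix with the lowest expected error. The accuracy figures are illustrative
# assumptions, not values from the thesis.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Fix:
    lat: float
    lon: float
    accuracy_m: float  # expected error radius
    source: str

def best_fix(gps: Optional[Fix], wifi: Optional[Fix], gsm: Optional[Fix]) -> Optional[Fix]:
    """Return the available fix with the smallest expected error, so location
    remains almost constantly available even when GPS drops out indoors."""
    candidates = [f for f in (gps, wifi, gsm) if f is not None]
    return min(candidates, key=lambda f: f.accuracy_m, default=None)

# Indoors: no GPS fix, and WiFi beats GSM on expected accuracy.
wifi = Fix(55.8721, -4.2882, accuracy_m=25.0, source="802.11")
gsm = Fix(55.8730, -4.2900, accuracy_m=300.0, source="GSM")
print(best_fix(None, wifi, gsm).source)  # "802.11"
```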