    Community Seismic Network

    The article describes the design of the Community Seismic Network, a dense, open seismic network built on low-cost sensors. Inputs come from sensors hosted by community volunteers, either connected directly to their personal computers or built into mobile devices. The server is cloud-based, both for robustness and to dynamically handle the load of impulsive earthquake events. The network's main product is a map of peak acceleration, delivered within seconds of the ground shaking. The lateral variations in the level of shaking will be valuable to first responders, and the waveform information from a dense network will allow detailed mapping of the rupture process. Sensors in buildings may also be useful for monitoring the state of health of a structure after major shaking.
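
    As an illustration of the client side such an architecture implies, the sketch below shows a hypothetical host agent that buffers accelerometer samples and reports the peak acceleration of each short window to a cloud ingest endpoint. The endpoint URL, report fields, and one-second windowing are assumptions for illustration, not the network's actual protocol.

```python
import collections
import json
import urllib.request

# Hypothetical ingest endpoint; the real Community Seismic Network
# protocol is not described in the abstract.
INGEST_URL = "https://example.org/csn/report"
WINDOW_SECONDS = 1.0

samples = collections.deque()  # (timestamp_s, acceleration_g) pairs

def on_sample(timestamp_s, accel_g):
    """Collect one accelerometer sample; flush when the window is full."""
    samples.append((timestamp_s, accel_g))
    if timestamp_s - samples[0][0] >= WINDOW_SECONDS:
        flush_window()

def flush_window():
    """Report the window's peak absolute acceleration to the cloud server."""
    peak_g = max(abs(a) for _, a in samples)
    report = {"sensor_id": "host-1234", "t": samples[-1][0], "peak_g": peak_g}
    samples.clear()
    req = urllib.request.Request(
        INGEST_URL,
        data=json.dumps(report).encode(),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req, timeout=2)  # short timeout: stale data is useless
```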

    A Web-Based Distributed Virtual Educational Laboratory

    The evolution and cost of measurement equipment, together with the demands of continuous training and distance learning, make it difficult to provide a complete, up-to-date set of workbenches to every student. For preliminary familiarization and experimentation with instrumentation and measurement procedures, virtual equipment is often considered more than sufficient from a didactic point of view, while hands-on work with real instrumentation and measurement systems remains necessary to complete and refine the student's practical expertise. The creation and distribution of workbenches in networked computer laboratories therefore becomes attractive and convenient. This paper describes the specification and design of a geographically distributed system based on commercially standard components.

    Mobile Glaucoma Detection Application

    Glaucoma is a debilitating degenerative eye disease that can lead to vision loss and eventually blindness. Given its asymptomatic nature, most people with glaucoma are not even aware that they have the disease, so it is often left untreated until it is too late. Detecting the presence of glaucoma is one of the most important steps in treating it, but unfortunately also one of the most difficult to carry out. The Mobile Glaucoma Detection Application aims to reduce the growing number of individuals who are unaware that they have glaucoma by providing a simple detection mechanism that notifies users if they are at risk. The system does this by enabling users to independently conduct tonometry exams through the application. Tonometry examinations allow doctors to determine whether the intra-ocular pressure levels in a person's eyes put them at risk for glaucoma. The M.G.D.A. (Mobile Glaucoma Detection Application) allows users to determine their intra-ocular pressure levels from the comfort of their own home via a special contact lens paired with a smartphone application. The system also lets users monitor, regulate, and track their use and progress through the system.
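
    To make the risk notification concrete, below is a minimal sketch of the kind of check such an app could run on lens readings. The 21 mmHg cutoff is the conventional upper bound of normal intra-ocular pressure; the application's actual thresholds and data model are not given in the abstract.

```python
from statistics import mean

# Conventional upper bound of normal intra-ocular pressure (mmHg);
# the app's real clinical thresholds are an assumption here.
IOP_RISK_THRESHOLD_MMHG = 21.0

def assess_iop(readings_mmhg):
    """Flag a session as at-risk if mean IOP exceeds the threshold.

    readings_mmhg: intra-ocular pressure samples (mmHg) taken by the
    contact-lens sensor during one tonometry session.
    """
    avg = mean(readings_mmhg)
    return {"average_mmHg": round(avg, 1),
            "at_risk": avg > IOP_RISK_THRESHOLD_MMHG}

# Example session with mildly elevated pressure:
print(assess_iop([22.5, 23.1, 21.8]))  # {'average_mmHg': 22.5, 'at_risk': True}
```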

    Estimating packet loss rate in the access through application-level measurements

    End-user monitoring of quality of experience is one of the necessary steps to achieve effective control over network neutrality. Involving the end user, however, requires light, user-friendly tools that can be run at the application level with limited effort and limited use of network resources. In this paper, we propose a simple model to estimate the packet loss rate perceived by a connection from round-trip time and TCP goodput samples collected at the application level. The model is derived from the well-known Mathis equation, which predicts the bandwidth of a steady-state TCP connection under random losses and delayed ACKs. The model is evaluated in a testbed environment under a wide range of conditions, and experiments are also run on real access networks. We plan to use the model to analyze the results collected by the "network neutrality bot" (Neubot), a research tool that performs application-level network-performance measurements. The methodology is easily portable, however, and can be of interest to essentially any user application that performs large downloads or uploads and needs to estimate access-network quality and its variation.
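
    For reference, a minimal sketch of the inversion such a model implies is shown below. The Mathis equation gives steady-state TCP bandwidth as BW = (MSS/RTT) · C/√p with C = √(3/(2b)), where b is the number of segments covered by one ACK (b = 2 with delayed ACKs), so a loss-rate estimate follows by solving for p. The MSS default and the per-sample inversion are assumptions; the paper derives and validates its own variant of the model.

```python
import math

def estimated_loss_rate(goodput_bps, rtt_s, mss_bytes=1460, b=2):
    """Estimate packet loss rate p by inverting the Mathis equation.

    BW = (MSS / RTT) * C / sqrt(p),  with C = sqrt(3 / (2 * b)),
    so  p = (C * MSS / (RTT * BW)) ** 2.
    b = 2 models delayed ACKs, as in the abstract.
    """
    c = math.sqrt(3.0 / (2.0 * b))
    mss_bits = mss_bytes * 8  # goodput is in bits/s, so convert MSS too
    return (c * mss_bits / (rtt_s * goodput_bps)) ** 2

# Example: 8 Mbit/s goodput at 50 ms RTT implies roughly 0.06% loss.
print(estimated_loss_rate(8e6, 0.050))  # ~6.4e-4
```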

    Automated Discovery of Internet Censorship by Web Crawling

    Censorship of the Internet is widespread around the world. As access to the web becomes increasingly ubiquitous, filtering of this resource becomes more pervasive, and transparency about the specific content citizens are denied is atypical. To counter this, numerous techniques for maintaining URL filter lists have been proposed by individuals and organisations that aim to provide empirical data on censorship for the benefit of the public and the wider censorship research community. We present a new approach for discovering filtered domains in different countries. The method is fully automated and requires no human interaction: the system uses web crawling techniques to traverse between filtered sites and implements a robust method for determining whether a domain is filtered. We demonstrate the effectiveness of the approach by running experiments to search for filtered content in four different censorship regimes. Our results show that we perform better than the current state of the art and have built domain filter lists an order of magnitude larger than the most widely available public lists as of Jan 2018. Further, we build a dataset mapping the interlinking of blocked content between domains, exhibiting the tightly networked nature of censored web resources.
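
    A minimal sketch of the crawl-and-test loop the abstract describes might look as follows. Here is_filtered() stands in for the paper's filtering test (e.g. comparing a fetch from inside the censored network with an uncensored control fetch), and the link extraction, seed list, and depth limit are assumptions for illustration.

```python
import re
from collections import deque
from urllib.parse import urlparse

def is_filtered(domain):
    """Placeholder for the filtering oracle, e.g. fetch the domain from a
    vantage point inside the censored network and compare the response
    with an uncensored control fetch. An assumption, not the published
    method."""
    raise NotImplementedError

def extract_domains(html):
    """Crude link extraction: pull hostnames out of href attributes."""
    urls = re.findall(r'href=["\'](https?://[^"\']+)', html)
    return {urlparse(u).hostname for u in urls} - {None}

def discover_filtered(seeds, fetch, max_depth=3):
    """Breadth-first crawl from known-blocked seed domains, keeping every
    newly reached domain that also tests as filtered."""
    seen, filtered = set(seeds), set()
    queue = deque((d, 0) for d in seeds)
    while queue:
        domain, depth = queue.popleft()
        if not is_filtered(domain):
            continue
        filtered.add(domain)
        if depth < max_depth:
            for nxt in extract_domains(fetch(domain)) - seen:
                seen.add(nxt)
                queue.append((nxt, depth + 1))
    return filtered
```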