
    Wireless Sensor Networks for Oceanographic Monitoring: A Systematic Review

    Monitoring of the marine environment has become a field of growing scientific interest over the last decade. The instruments used in this work have ranged from small-scale sensor networks to complex observation systems. Among small-scale networks, Wireless Sensor Networks (WSNs) are a highly attractive solution in that they are easy to deploy, operate and dismantle and are relatively inexpensive. The aim of this paper is to identify, appraise, select and synthesize all high-quality research evidence relevant to the use of WSNs in oceanographic monitoring. The literature is systematically reviewed to offer an overview of the present state of this field of study and identify the principal resources that have been used to implement networks of this kind. Finally, this article details the challenges and difficulties that have to be overcome if these networks are to be successfully deployed.

    Automated post-fault diagnosis of power system disturbances

    In order to automate the analysis of SCADA and digital fault recorder (DFR) data for a transmission network operator in the UK, the authors have developed an industrial-strength multi-agent system entitled protection engineering diagnostic agents (PEDA). The PEDA system integrates a number of legacy intelligent systems for analyzing power system data as autonomous intelligent agents. The integration achieved through multi-agent systems technology enhances the diagnostic support offered to engineers by focusing the analysis on the most pertinent DFR data based on the results of SCADA analysis. Since November 2004 the PEDA system has been operating online at a UK utility. In this paper the authors focus on the underlying intelligent system techniques, i.e. rule-based expert systems, model-based reasoning and state-of-the-art multi-agent system technology, that PEDA employs and the lessons learnt through its deployment and online use.

    Low frequency VLBI in space using GAS-Can satellites: Report on the May 1987 JPL Workshop

    Summarized are the results of a workshop held at JPL on May 28 and 29, 1987, to study the feasibility of using small, very inexpensive spacecraft for a low-frequency radio interferometer array. Many technical aspects of a mission to produce high angular resolution images of the entire sky at frequencies from 2 to 20 MHz were discussed. The workshop conclusion was that such a mission was scientifically valuable and technically practical. A useful array could be based on six or more satellites no larger than those launched from Get-Away-Special canisters. The cost of each satellite could be $1-2M, and the mass less than 90 kg. Many details require further study, but as this report shows, there is good reason to proceed. No fundamental problems have been discovered involving the use of untraditional, very inexpensive spacecraft for this type of mission.

    Sub-mW Neuromorphic SNN audio processing applications with Rockpool and Xylo

    Spiking Neural Networks (SNNs) provide an efficient computational mechanism for temporal signal processing, especially when coupled with low-power SNN inference ASICs. SNNs have historically been difficult to configure, lacking a general method for finding solutions to arbitrary tasks. In recent years, gradient-descent optimization methods have been applied to SNNs with increasing ease. SNNs and SNN inference processors therefore offer a good platform for commercial low-power signal processing in energy-constrained environments without cloud dependencies. However, to date these methods have not been accessible to ML engineers in industry, requiring graduate-level training to successfully configure a single SNN application. Here we demonstrate a convenient high-level pipeline to design, train and deploy arbitrary temporal signal processing applications to sub-mW SNN inference hardware. We apply a new straightforward SNN architecture designed for temporal signal processing, using a pyramid of synaptic time constants to extract signal features at a range of temporal scales. We demonstrate this architecture on an ambient audio classification task, deployed to the Xylo SNN inference processor in streaming mode. Our application achieves high accuracy (98%) and low latency (100 ms) at low power (<100 μW inference power). Our approach makes training and deploying SNN applications available to ML engineers with general NN backgrounds, without requiring specific prior experience with spiking NNs. We intend for our approach to make neuromorphic hardware and SNNs an attractive choice for commercial low-power and edge signal processing applications.
    Comment: This submission has been removed by arXiv administrators because the submitter did not have the authority to grant a license to the work at the time of submission.
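    The "pyramid of synaptic time constants" described above can be illustrated outside any SNN framework: a bank of leaky integrators, each with a progressively slower time constant, smooths the signal envelope at several temporal scales. This is a minimal sketch of that idea; the function name, sample rate and time-constant values are assumptions, not taken from the paper or from Rockpool.

    ```python
    import numpy as np

    def time_constant_pyramid(signal, fs, taus):
        """Filter a 1-D signal with a bank of leaky integrators,
        one per time constant in `taus` (seconds), producing one
        feature channel per temporal scale (hypothetical sketch)."""
        features = np.zeros((len(taus), len(signal)))
        for i, tau in enumerate(taus):
            alpha = np.exp(-1.0 / (tau * fs))  # per-sample decay factor
            state = 0.0
            for t, x in enumerate(signal):
                # leaky integration of the rectified input
                state = alpha * state + (1.0 - alpha) * abs(x)
                features[i, t] = state
        return features

    fs = 16_000                             # assumed audio sample rate, Hz
    taus = [0.002, 0.008, 0.032, 0.128]     # pyramid: each scale 4x slower
    audio = np.random.randn(fs)             # one second of noise as a stand-in
    feats = time_constant_pyramid(audio, fs, taus)
    print(feats.shape)                      # (4, 16000)
    ```

    Slower channels respond to long-range signal structure while faster channels track transients, which is what lets a downstream classifier see features at several temporal scales at once.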

    Understanding Digital Technology’s Evolution and the Path of Measured Productivity Growth: Present and Future in the Mirror of the Past

    Three styles of explanation have been advanced by economists seeking to account for the so-called 'productivity paradox'. The coincidence of a persisting slowdown in the growth of measured total factor productivity (TFP) in the US, since the mid-1970s, with the wave of information technology (IT) innovations, is said by some to be an illusion due to the mismeasurement of real output growth; by others to expose the mistaken expectations about the benefits of computerization; and by still others to reflect the volume of intangible investments in 'learning' and the time required for ancillary innovations that allow the new digital technologies to be applied in ways that are reflected in measured productivity growth. This paper shows that rather than viewing these as competing hypotheses, the dynamics of the transition to a new technological and economic regime based upon a general purpose technology (GPT) should be understood to be likely to give rise to all three 'effects'. It more fully articulates and supports this thesis, which was first advanced in the 'computer and dynamo' papers by David (1990, 1991). The relevance of that historical experience is re-asserted and supported by further evidence rebutting skeptics who have argued that the diffusion of electrification and computerization have little in common. New evidence is produced about the links between IT use, mass customization, and the upward bias of output price deflators arising from the method used to 'chain in' new product prices. The measurement bias due to the exclusion of intangible investments from the scope of the official national product accounts is also examined. Further, it is argued that the development of the general-purpose PC delayed the re-organization of businesses along lines that would have more directly raised task productivity, even though the technologies yielded positive 'revenue productivity' gains for large companies.
    The paper concludes by indicating the emerging technical and organizational developments that are likely to deliver a sustained surge of measured TFP growth during the decades that lie immediately ahead.

    Web-based relay management with biometric authentication

    This thesis proposes a web-based system for managing digital relay settings. These relays are deployed in the power system to protect sensitive and expensive equipment from physical damage during system faults and overload conditions. Providing this capability exposes these devices to the same cyber security threats that corporations have faced for many years.
    This thesis investigates the risks and requirements for deploying the proposed system. A breakdown in the protection that these relays provide would cause power outages, and the cost of outages can be significant. Therefore, cyber security is critical in the system design. Cyber security requirements for the power industry identify access control as an important aspect for the protection of its infrastructure. If properly implemented, biometrics can be used to strengthen access control to computer systems.
    The web-based relay management system uses fingerprint authentication along with a username and password to provide access control. Website users are given access to functionality based on user roles. Only high-level users may attempt relay setting modification. The relay management system interacts with a database that stores the current relay settings, relay setting restrictions, and a queue of relay updates. A process is implemented to verify attempted setting changes against these setting restrictions. This provides an extra security layer if users attempt harmful changes to protection schemes. Valid setting changes are added to the queue and a separate relay update program communicates these changes to the relay. The database and relay update program protect the relays from direct modification. These features combined with biometric authentication provide a strong layered scheme for protecting relays, while supplying an easy-to-use interface for remotely using their capabilities.
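    The validate-then-queue pattern described above can be sketched briefly: a requested setting change is checked against a restriction table, and only valid changes are enqueued for the separate relay update program. The setting name, restriction values and function names below are hypothetical illustrations, not taken from the thesis.

    ```python
    from dataclasses import dataclass
    from queue import Queue

    @dataclass
    class SettingRestriction:
        """Allowed range for one relay setting (assumed structure)."""
        min_value: float
        max_value: float

    # Hypothetical restriction table, as stored in the database.
    RESTRICTIONS = {
        "overcurrent_pickup_A": SettingRestriction(min_value=100.0, max_value=800.0),
    }

    update_queue = Queue()  # consumed by the separate relay update program

    def request_setting_change(name, value):
        """Verify a requested change against its restriction; only
        valid changes reach the queue, so harmful changes to the
        protection scheme are rejected before touching the relay."""
        rule = RESTRICTIONS.get(name)
        if rule is None:
            return False  # unknown setting: reject
        if not (rule.min_value <= value <= rule.max_value):
            return False  # out of the permitted range: reject
        update_queue.put((name, value))
        return True

    print(request_setting_change("overcurrent_pickup_A", 400.0))   # True
    print(request_setting_change("overcurrent_pickup_A", 5000.0))  # False
    ```

    Keeping the relay behind the queue, rather than letting the web tier write to it directly, is what gives the design its layered protection.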

    Integration of a cellular Internet-of-Things transceiver into 6G test network and evaluation of its performance

    Abstract. This thesis focuses on the integration and deployment of an aftermarket cellular IoT transceiver on a 6G/5G test network for the purpose of evaluating the feasibility of such a device for monitoring network performance. The cellular technology employed was NB-IoT paired with a Raspberry Pi device as the microprocessor that collects network telemetry and uses the MQTT protocol to provide a constant data feed. The system was first tested in a public cellular network through a local service provider and was successfully connected to the network, establishing TCP/IP connections and allowing internet connectivity. To monitor network information and gather basic telemetry data, a network monitoring utility was developed. It collected data such as network identifiers, module registration status, band/channel, signal strength and GPS position. This data was then published to an MQTT broker. The Adafruit IO platform served as the MQTT broker, providing an interface to visualize the collected data. Furthermore, the system was configured for and deployed on a 6G/5G test network successfully. The device functionality that was developed and tested in the public network remained intact, enabling continuous monitoring and analysis of network data. Through this study, valuable insights into the integration and deployment of cellular IoT transceivers into cellular networks that employ the latest IoT technology were gained. The findings highlight the feasibility of utilizing such a system for network monitoring and demonstrate the potential for IoT applications in cellular networks.
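    A telemetry sample of the kind the monitoring utility publishes can be sketched as a small JSON record: network identifiers, registration status, band, signal strength and GPS position, serialized for an MQTT topic. The field names, topic name and example values below are assumptions for illustration; the thesis's actual message format is not specified here.

    ```python
    import json
    import time

    def build_telemetry(cell_id, band, rssi_dbm, lat, lon, registered):
        """Assemble one telemetry sample covering the data the
        monitoring utility collects (field names are assumed)."""
        return {
            "timestamp": int(time.time()),   # seconds since epoch
            "cell_id": cell_id,              # serving-cell identifier
            "band": band,                    # NB-IoT band/channel
            "rssi_dbm": rssi_dbm,            # signal strength
            "registered": registered,        # module registration status
            "gps": {"lat": lat, "lon": lon}, # position fix
        }

    sample = build_telemetry("0x1A2F", "B3", -97, 65.013, 25.465, True)
    payload = json.dumps(sample)
    # The JSON string would then be published to a broker such as
    # Adafruit IO on a topic like "nbiot/telemetry" (topic name assumed).
    print(payload)
    ```

    Serializing to a compact, self-describing payload like this keeps the publisher decoupled from whichever dashboard or broker visualizes the feed.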