    ART Neural Networks: Distributed Coding and ARTMAP Applications

    ART (Adaptive Resonance Theory) neural networks for fast, stable learning and prediction have been applied in a variety of areas. Applications include airplane design and manufacturing, automatic target recognition, financial forecasting, machine tool monitoring, digital circuit design, chemical analysis, and robot vision. Supervised ART architectures, called ARTMAP systems, feature internal control mechanisms that create stable recognition categories of optimal size by maximizing code compression while minimizing predictive error in an on-line setting. Special-purpose requirements of various application domains have led to a number of ARTMAP variants, including fuzzy ARTMAP, ART-EMAP, Gaussian ARTMAP, and distributed ARTMAP. ARTMAP has been used for a variety of applications, including computer-assisted medical diagnosis. Medical databases present many of the challenges found in general information management settings, where speed, efficiency, ease of use, and accuracy are at a premium. A direct goal of improved computer-assisted medicine is to help deliver quality emergency care in situations that may be less than ideal. Working with these problems has stimulated a number of ART architecture developments, including ARTMAP-IC [1]. This paper describes a recent collaborative effort that, using a new cardiac care database for system development, has brought together medical statisticians and clinicians at the New England Medical Center with researchers developing expert systems and neural networks, in order to create a hybrid method for medical diagnosis. The paper also considers new neural network architectures, including distributed ART (dART), a real-time model of parallel distributed pattern learning that permits fast as well as slow adaptation without catastrophic forgetting. Local synaptic computations in the dART model quantitatively match the paradoxical phenomenon of Markram-Tsodyks [2] redistribution of synaptic efficacy, as a consequence of global system hypotheses. Office of Naval Research (N00014-95-1-0409, N00014-95-1-0657)
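    As a concrete illustration of the category mechanics referred to above, the sketch below implements the standard fuzzy ART choice, vigilance, and fast-learning equations for a single input presentation. The parameter values, the toy input, and the helper name fuzzy_art_step are illustrative assumptions, not code or settings from the paper.

```python
import numpy as np

def fuzzy_art_step(I, weights, alpha=0.001, rho=0.75, beta=1.0):
    """One fuzzy ART presentation: choose a category, test vigilance, learn.

    I       -- complement-coded input vector with values in [0, 1]
    weights -- list of category weight vectors (same length as I)
    alpha   -- choice parameter
    rho     -- vigilance (match threshold)
    beta    -- learning rate (1.0 = fast learning)
    """
    # Category choice: T_j = |I ^ w_j| / (alpha + |w_j|), with ^ = fuzzy AND (min)
    scores = [np.minimum(I, w).sum() / (alpha + w.sum()) for w in weights]
    for j in sorted(range(len(weights)), key=lambda k: -scores[k]):
        # Vigilance test: |I ^ w_j| / |I| >= rho, else reset and try the next category
        if np.minimum(I, weights[j]).sum() / I.sum() >= rho:
            # Resonance: update the winning category's weights
            weights[j] = beta * np.minimum(I, weights[j]) + (1 - beta) * weights[j]
            return j
    # No existing category passes vigilance: commit a new one coded by the input
    weights.append(I.copy())
    return len(weights) - 1

# Toy usage with complement coding [a, 1 - a]
a = np.array([0.2, 0.8])
I = np.concatenate([a, 1.0 - a])
categories = []
print(fuzzy_art_step(I, categories))  # first input commits category 0
```

    Raising the vigilance rho forces finer categories; ARTMAP couples this search to a supervised map field that raises vigilance just enough to correct a predictive error, which is how code compression is balanced against predictive accuracy.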

    Forensic Methods and Tools for Web Environments

    The Web is one of the most exciting and dynamic areas of development in today’s technology. However, with such activity, innovation, and ubiquity have come a set of new challenges for digital forensic examiners, making their jobs even more difficult. For examiners to become as effective with evidence from the Web as they currently are with more traditional evidence, they need (1) methods that guide them in how to approach this new type of evidence and (2) tools that accommodate web environments’ unique characteristics. In this dissertation, I present my research to alleviate the difficulties forensic examiners currently face with respect to evidence originating from web environments. First, I introduce a framework for web environment forensics, which elaborates on and addresses the key challenges examiners face and outlines a method for approaching web-based evidence. Next, I describe my work to identify extensions installed on encrypted web thin clients using only a sound understanding of these systems’ inner workings and the metadata of the encrypted files. Finally, I discuss my approach to reconstructing the timeline of events on encrypted web thin clients by using service provider APIs as a proxy for directly analyzing the device. In each of these research areas, I also introduce structured formats that I customized to accommodate the unique features of the evidence sources while also facilitating tool interoperability and information sharing. Doctoral Dissertation, Computer Science, 201
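    A hedged sketch of the metadata-only inference described above: matching the sizes of a thin client's encrypted files against a catalog of known extension footprints. The catalog, the names, and the matching threshold are hypothetical placeholders for illustration, not the dissertation's actual format or tooling.

```python
# Hypothetical illustration: flag likely installed extensions from
# encrypted-file metadata (sizes only), without decrypting any content.
from collections import Counter

# Assumed catalog mapping extension IDs to the multiset of file sizes
# (in bytes) their resources produce once stored on the thin client.
KNOWN_EXTENSIONS = {
    "ext_alpha": [1024, 2048, 4096],
    "ext_beta": [512, 512, 8192, 16384],
}

def score_extension(observed_sizes, fingerprint):
    """Fraction of an extension's fingerprint sizes present on the device."""
    observed = Counter(observed_sizes)
    hits = sum(min(observed[size], count) for size, count in Counter(fingerprint).items())
    return hits / len(fingerprint)

def identify(observed_sizes, threshold=0.8):
    """Return extensions whose size fingerprints are mostly accounted for."""
    return [ext for ext, fp in KNOWN_EXTENSIONS.items()
            if score_extension(observed_sizes, fp) >= threshold]

# Example: file sizes harvested from the encrypted profile's metadata
print(identify([1024, 2048, 4096, 300, 700]))  # ['ext_alpha']
```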

    Preliminary Candidate Advanced Avionics System (PCAAS)

    Specifications that define the system functional requirements, the subsystem and interface needs, and other requirements such as maintainability, modularity, and reliability are summarized. A design definition of all required avionics functions and a system risk analysis are presented.

    Identification of high-level functional/system requirements for future civil transports

    In order to accommodate the rapid growth in commercial aviation throughout the remainder of this century, the Federal Aviation Administration (FAA) is faced with a formidable challenge to upgrade and/or modernize the National Airspace System (NAS) without compromising safety or efficiency. A recurring theme in both the Aviation System Capital Investment Plan (CIP), which has replaced the NAS Plan, and the new FAA Plan for Research, Engineering, and Development (RE&D) is the application of new technologies and a greater use of automation. Identifying the high-level functional and system impacts of such modernization efforts on future civil transport operational requirements, particularly in terms of cockpit functionality and information transfer, was the primary objective of this project. The FAA planning documents for the NAS of the 2005 era and beyond were surveyed, and the major aircraft functional capabilities and system components required for such an operating environment were identified. A hierarchical structured analysis of the information processing and flows emanating from these functional/system components was conducted, and the results were documented in graphical form depicting the relationships between functions and systems.
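    To make the output of such a structured analysis concrete, the placeholder sketch below captures a miniature function/system decomposition with its information flows; the function names, systems, and flows are invented for illustration and are not the project's actual decomposition.

```python
# Invented placeholder data: a miniature functional decomposition relating
# aircraft functions, supporting systems, and information flows.
FUNCTIONS = {
    "4D Flight Path Management": {
        "parent": None,
        "systems": ["FMS", "Data Link"],
        "flows": [("out", "trajectory intent", "ATC Ground Automation"),
                  ("in", "clearance amendments", "ATC Ground Automation")],
    },
    "Trajectory Negotiation": {
        "parent": "4D Flight Path Management",
        "systems": ["Data Link"],
        "flows": [("out", "requested trajectory", "ATC Ground Automation"),
                  ("in", "approved trajectory", "ATC Ground Automation")],
    },
}

def report(functions):
    """Print each function with its supporting systems and information flows."""
    for name, f in functions.items():
        indent = "  " if f["parent"] else ""
        print(f"{indent}{name}  [systems: {', '.join(f['systems'])}]")
        for direction, label, peer in f["flows"]:
            arrow = "->" if direction == "out" else "<-"
            print(f"{indent}  {arrow} {peer}: {label}")

report(FUNCTIONS)
```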

    Simulating Real-Time Aspects of Wireless Sensor Networks

    Wireless Sensor Network (WSN) technology has mainly been used in applications with low-frequency sampling and little computational complexity. Recently, new classes of WSN-based applications with different characteristics are being considered, including process control, industrial automation, and visual surveillance. Such new applications usually involve relatively heavy computations and also present real-time requirements such as bounded end-to-end delay and guaranteed Quality of Service. It then becomes necessary to employ proper resource management policies, not only for communication resources but also jointly for computing resources, in the design and development of such WSN-based applications. In this context, simulation can play a critical role, together with analytical models, in validating a system design against the Quality of Service parameters it must meet. In this paper, we present RTNS, a publicly available free simulation tool which incorporates Operating System aspects into the simulation of wireless distributed applications. RTNS extends the well-known NS-2 simulator with models of the CPU, the Real-Time Operating System, and the application tasks, to take into account delays due to computation in addition to communication. We demonstrate the benefits of RTNS by presenting our simulation study of a complex WSN-based multi-view vision system for real-time event detection.
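    A minimal sketch of the core idea of accounting for computation as well as communication delay, in the spirit of what RTNS adds to NS-2; the per-node parameters, the fixed-priority approximation, and the three-hop topology are assumptions made for this example and are not RTNS's actual API.

```python
# Illustrative model only: end-to-end delay of a sensed event that is
# processed at each hop (computation) and forwarded over the radio
# (communication). Numbers and structure are assumptions, not RTNS's API.

def hop_delay(exec_time_s, higher_prio_load, tx_bytes, bitrate_bps, prop_delay_s):
    """Per-hop delay = CPU response time under load + transmission + propagation."""
    # Crude fixed-priority approximation: execution time inflated by the
    # fraction of the CPU already consumed by higher-priority tasks.
    cpu_delay = exec_time_s / (1.0 - higher_prio_load)
    comm_delay = (8 * tx_bytes) / bitrate_bps + prop_delay_s
    return cpu_delay + comm_delay

# Three-hop path from a camera node to the sink of a multi-view vision system.
path = [
    dict(exec_time_s=0.020, higher_prio_load=0.30, tx_bytes=400,
         bitrate_bps=250_000, prop_delay_s=0.001),  # sensing + feature extraction
    dict(exec_time_s=0.002, higher_prio_load=0.10, tx_bytes=400,
         bitrate_bps=250_000, prop_delay_s=0.001),  # relay node
    dict(exec_time_s=0.005, higher_prio_load=0.20, tx_bytes=400,
         bitrate_bps=250_000, prop_delay_s=0.001),  # sink-side processing
]

end_to_end = sum(hop_delay(**hop) for hop in path)
print(f"end-to-end delay: {end_to_end * 1000:.1f} ms")  # compare against the deadline
```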

    A Smartphone-Based System for Outdoor Data Gathering Using a Wireless Beacon Network and GPS Data: From Cyber Spaces to Senseable Spaces

    Information and Communication Technologies (ICTs) and mobile devices are deeply influencing all facets of life, directly affecting the way people experience space and time. ICTs are also tools for supporting urban development, and they have been adopted as equipment for furnishing public spaces. Hence, ICTs have created a new paradigm of hybrid space that can be defined as Senseable Spaces. Even though there are relevant cases where the adoption of ICTs has made the use of public open spaces more “smart”, the interrelation and the recognition of the added value still need to be further developed. This is one of the motivations for the research presented in this paper. The main goal of the work reported here is the deployment of a system composed of three different connected elements (a real-world infrastructure, a data gathering system, and a data processing and analysis platform) for the analysis of human behavior in the open space of Cardeto Park in Ancona, Italy. For this purpose, and because of the complexity of this task, several actions have been carried out: the deployment of a complete real-world infrastructure in Cardeto Park, the implementation of an ad-hoc smartphone application for gathering participants’ data, and the development of a data pre-processing and analysis system for dealing with all the gathered data. A detailed description of these three aspects and the way in which they are connected to create a unique system is the main focus of this paper. This work has been supported by the COST Action TU1306 (CYBERPARKS: Fostering knowledge about the relationship between Information and Communication Technologies and Public Spaces supported by strategies to improve their use and attractiveness), the Spanish Ministry of Economy and Competitiveness under the ESPHIA project (ref. TIN2014-56042-JIN) and the TARSIUS project (ref. TIN2015-71564-C4-4-R), and the Basque Country Department of Education under the BLUE project (ref. PI-2016-0010). The authors would also like to thank the staff of UbiSive s.r.l. for their support in developing the application.
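    To illustrate the kind of data alignment such a gathering system needs, the hedged sketch below pairs beacon sightings with the nearest-in-time GPS fix recorded by the phone; the record formats, field names, and the 5-second matching window are assumptions for the example, not the project's actual schema.

```python
# Hypothetical record formats: the real application's schema may differ.
from dataclasses import dataclass

@dataclass
class GpsFix:
    t: float      # Unix timestamp, seconds
    lat: float
    lon: float

@dataclass
class BeaconSighting:
    t: float      # Unix timestamp, seconds
    beacon_id: str
    rssi: int     # received signal strength, dBm

def attach_positions(fixes, sightings, max_gap_s=5.0):
    """Pair each beacon sighting with the nearest-in-time GPS fix, if close enough."""
    fixes = sorted(fixes, key=lambda f: f.t)
    paired = []
    for s in sorted(sightings, key=lambda s: s.t):
        nearest = min(fixes, key=lambda f: abs(f.t - s.t))
        if abs(nearest.t - s.t) <= max_gap_s:
            paired.append((s.beacon_id, s.rssi, nearest.lat, nearest.lon, s.t))
    return paired

# Toy data roughly located in Ancona, Italy
fixes = [GpsFix(100.0, 43.6169, 13.5184), GpsFix(110.0, 43.6171, 13.5188)]
sightings = [BeaconSighting(101.5, "park-entrance-01", -68)]
print(attach_positions(fixes, sightings))
```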

    The 30/20 GHz flight experiment system, phase 2. Volume 2: Experiment system description

    A detailed technical description of the 30/20 GHz flight experiment system is presented. The overall communication system is described with performance analyses, communication operations, and experiment plans. Hardware descriptions of the payload are given with the tradeoff studies that led to the final design. The spacecraft bus which carries the payload is discussed, and its interface with the launch vehicle system is described. Finally, the hardware and operations of the terrestrial segment are presented.

    HSP-Wrap: The Design and Evaluation of Reusable Parallelism for a Subclass of Data-Intensive Applications

    There is an increasing gap between the rate at which data is generated by scientific and non-scientific fields and the rate at which that data can be processed by available computing resources. In this paper, we introduce the fields of Bioinformatics and Cheminformatics, two fields where big data has become a problem due to continuing advances in the technologies that drive these fields, such as gene sequencing and small-ligand exploration. We introduce high-performance computing as a means to process this growing base of data in order to facilitate knowledge discovery. We enumerate the goals of the project, including reusability, efficiency, reliability, and scalability. We then describe the implementation of a software scheduler that aims to improve the input and output performance of a targeted collection of informatics tools, as well as the profiling and optimization needed to tune the software. We evaluate the performance of the software with a scalability study of the Bioinformatics tools BLAST, HMMER, and MUSCLE, as well as the Cheminformatics tool DOCK6.
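    As a rough sketch of what a wrapper scheduler for such tools can look like, the snippet below batches inputs and fans them out to a pool of worker processes that would each invoke a tool such as BLAST; the batch size, the placeholder run_tool command, and the overall structure are assumptions for illustration, not HSP-Wrap's actual design.

```python
# Illustrative sketch only: batch inputs and dispatch them to worker
# processes, amortizing per-invocation I/O cost. Not HSP-Wrap's actual code.
import subprocess
from concurrent.futures import ProcessPoolExecutor

def chunks(items, size):
    """Yield successive fixed-size batches from a list of inputs."""
    for i in range(0, len(items), size):
        yield items[i:i + size]

def run_tool(batch):
    """Placeholder worker: a real wrapper would stream the batch to the tool
    (e.g. a BLAST search) and collect its output; here we just echo the size."""
    return subprocess.run(
        ["echo", f"processed {len(batch)} queries"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()

def schedule(queries, batch_size=64, workers=4):
    """Fan batched work out to worker processes and gather the results."""
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(run_tool, chunks(queries, batch_size)))

if __name__ == "__main__":
    fake_queries = [f"seq_{i}" for i in range(200)]
    for line in schedule(fake_queries):
        print(line)
```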