
    A bibliography on formal methods for system specification, design and validation

    Literature on the specification, design, verification, testing, and evaluation of avionics systems was surveyed, yielding 655 citations. Journal papers, conference papers, and technical reports are included. Both manual and computer-based search methods were employed. Keywords used in the online search are listed.

    Proceedings, MSVSCC 2011

    Proceedings of the 5th Annual Modeling, Simulation & Visualization Student Capstone Conference, held on April 14, 2011, at VMASC in Suffolk, Virginia. 186 pp.

    Progress Report No. 3

    Progress report of the Biomedical Computer Laboratory, covering the period 1 July 1966 to 30 June 1967.

    Threshold elements and the design of sequential switching networks

    By A.K. Susskind, D.R. Haring, and C.L. Liu. Includes bibliographies. "AD 657370."

    A computer-aided design for digital filter implementation


    Knowledge Discovery and Monotonicity

    The monotonicity property is ubiquitous in our lives and appears in different roles: as domain knowledge, as a requirement, as a property that reduces the complexity of a problem, and so on. It is present in various domains: economics, mathematics, languages, operations research and many others. In knowledge discovery, monotonicity can be treated as available background information that facilitates and guides the knowledge extraction process; while some sub-areas have developed methods for taking this information into account, in most methodologies it has not been extensively studied, or has not been addressed at all. This thesis is a contribution to changing that. It focuses on the monotonicity property in knowledge discovery, more specifically in classification, attribute reduction, function decomposition, frequent pattern generation and the handling of missing values. Four specific problems are addressed within four methodologies: rough set theory, monotone decision trees, function decomposition and frequent pattern generation. In the first three parts, monotonicity is domain knowledge and a requirement on the outcome of the classification process; the three methodologies are extended to deal with monotone data so as to guarantee that the outcome also satisfies the monotonicity requirement. In the last part, monotonicity is a property that helps reduce the computation involved in frequent pattern generation; here the focus is on two of the best algorithms and their comparison, both theoretical and experimental.

    About the author: Viara Popova was born in Bourgas, Bulgaria, in 1972. She received her secondary education at the Mathematics High School "Nikola Obreshkov" in Bourgas. In 1996 she completed her higher education at Sofia University, Faculty of Mathematics and Informatics, graduating with a major in Informatics and a specialization in Information Technologies in Education. She then joined the Department of Information Technologies, first as an associate member and, from 1997, as an assistant professor. In 1999 she became a PhD student at Erasmus University Rotterdam, Faculty of Economics, Department of Computer Science. In 2004 she joined the Artificial Intelligence Group within the Department of Computer Science, Faculty of Sciences, at Vrije Universiteit Amsterdam as a postdoctoral researcher.
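    The monotonicity requirement central to the thesis can be made concrete with a small consistency check: a labelled dataset is monotone when no example whose feature vector dominates another's (componentwise greater or equal) carries a strictly smaller label. The sketch below is illustrative only; the function and variable names are assumptions, not taken from the thesis. It finds violating pairs, which is the kind of inconsistency the extended rough-set and decision-tree methods are designed to handle.

```python
from itertools import combinations

def dominates(a, b):
    """True if feature vector a is componentwise >= feature vector b."""
    return all(ai >= bi for ai, bi in zip(a, b))

def monotonicity_violations(examples):
    """Return pairs of (features, label) examples that violate the
    monotonicity constraint: a dominating feature vector must not
    receive a strictly smaller label."""
    violations = []
    for ea, eb in combinations(examples, 2):
        (xa, ya), (xb, yb) = ea, eb
        if dominates(xa, xb) and ya < yb:
            violations.append((ea, eb))
        elif dominates(xb, xa) and yb < ya:
            violations.append((eb, ea))
    return violations

# Hypothetical example: two ordinal features and an ordinal class label.
data = [((3, 2), 1), ((5, 4), 2), ((6, 5), 1)]  # third example breaks monotonicity
print(monotonicity_violations(data))
# [(((6, 5), 1), ((5, 4), 2))]
```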

    The 1974 NASA-ASEE summer faculty fellowship aeronautics and space research program

    Research activities by participants in the fellowship program are documented, and include such topics as: (1) multispectral imagery for detecting southern pine beetle infestations; (2) trajectory optimization techniques for low-thrust vehicles; (3) concentration characteristics of a Fresnel solar strip reflection concentrator; (4) calibration and reduction of video camera data; (5) fracture mechanics of Cer-Vit glass-ceramic; (6) space shuttle external propellant tank prelaunch heat transfer; (7) holographic interferometric fringes; and (8) atmospheric wind and stress profiles in a two-dimensional internal boundary layer.

    Phased mission analysis using the cause–consequence diagram method

    Most reliability analysis techniques and tools assume that a system used for a mission consists of a single phase. However, multiple phases are natural in many missions. A system that can be modelled as a mission consisting of a sequence of phases is called a phased mission system. In this case, for successful completion of each phase the system may have to meet different requirements. System failure during any phase will result in mission failure. Fault tree analysis, binary decision diagrams and Markov techniques have been used to model phased missions. The cause–consequence diagram method is an alternative technique capable of modelling all system outcomes (success and failure) in one logic diagram. [Continues.]
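    As a minimal numeric illustration of the phase-failure logic above (a sketch of the basic success model, not of the cause–consequence method itself): if phase successes were statistically independent, mission reliability would simply be the product of the phase reliabilities, because failure during any phase fails the mission. Real phased-mission analysis cannot assume such independence, since components shared across phases carry their failure state forward, which is exactly why fault trees, binary decision diagrams and Markov models are employed.

```python
from math import prod

def mission_reliability(phase_reliabilities):
    """The mission succeeds only if every phase succeeds, so under the
    (strong) assumption of independent phases the mission reliability
    is the product of the individual phase reliabilities."""
    return prod(phase_reliabilities)

# Hypothetical three-phase mission: take-off, cruise, landing.
print(mission_reliability([0.999, 0.995, 0.990]))  # ~0.984
```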

    Global Optimisation of Multi-Camera Moving Object Detection

    An important task in intelligent video surveillance is to detect multiple pedestrians. These pedestrians may be occluded by each other in a camera view. To overcome this problem, multiple cameras can be deployed to provide complementary information, and homography mapping has been widely used for the association and fusion of multi-camera observations. The intersection regions of the foreground projections usually indicate the locations of moving objects. However, many false positives may be generated from the intersections of non-corresponding foreground regions. In this thesis, an algorithm for multi-camera pedestrian detection is proposed. The first stage of this work proposes pedestrian candidate locations on the top view, using two approaches. The first is a top-down approach based on the probabilistic occupancy map framework: the ground plane is discretized into a grid, and the likelihood of pedestrian presence at each location is estimated by comparing a rectangle, of the average size of a pedestrian standing there, with the foreground silhouettes in all camera views. The second is a bottom-up approach based on multi-plane homography mapping: the foreground regions in all camera views are projected and overlaid in the top view according to the multi-plane homographies, and the potential locations of pedestrians are estimated from the intersection regions. In the second stage, borrowing the idea of the Quine-McCluskey (QM) method for logic function minimisation, essential candidates are first identified, each of which covers at least a significant part of the foreground that is not covered by the other candidates. Non-essential candidates are then selected to cover the remaining foreground by a repeated process, which alternates between merging redundant candidates and finding emerging essential candidates. As an alternative to the QM method, Petrick's method is used to find the minimum set of pedestrian candidates covering all the foreground regions. These two methods are non-iterative and greatly increase computational speed. No similar work has been proposed before. Experiments on benchmark video datasets have demonstrated the good performance of the proposed algorithm in comparison with other state-of-the-art methods for pedestrian detection.
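    The candidate-selection stage described above is, at heart, a set-cover problem. The following sketch is a loose rendering of that idea under stated assumptions: the names and data layout are invented for illustration, and a simple greedy rule stands in for Petrick's exact method. Essential candidates, each covering foreground that no other candidate covers, are selected first; the remaining foreground is then covered by additional candidates.

```python
def select_candidates(candidates):
    """Pick a small set of candidate locations whose projected regions
    cover all foreground. `candidates` maps a candidate id to the set
    of foreground cells it covers (illustrative data layout)."""
    # Essential candidates: each covers foreground nobody else covers.
    chosen = set()
    for cid, cover in candidates.items():
        others = set().union(*(c for k, c in candidates.items() if k != cid))
        if cover - others:
            chosen.add(cid)
    covered = set().union(*(candidates[c] for c in chosen))
    remaining = set().union(*candidates.values()) - covered
    # Cover the rest greedily; Petrick's method would give an exact minimum.
    while remaining:
        best = max(candidates, key=lambda c: len(candidates[c] & remaining))
        if not candidates[best] & remaining:
            break
        chosen.add(best)
        remaining -= candidates[best]
    return chosen

# Hypothetical example: "B" is essential (sole cover of cell 4);
# "A" is then added to cover cell 1.
cands = {"A": {1, 2}, "B": {2, 3, 4}, "C": {1}}
print(select_candidates(cands))  # {'A', 'B'} (set order may vary)
```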