
    Using Volunteer Tracking Information for Activity-Based Travel Demand Modeling and Finding Dynamic Interaction-Based Joint-Activity Opportunities

    Real-time locating technology is now used to identify and track the movements of individuals as they occur. With the increased use of mobile technology, we can explore more of the potential interactions between people and their living environment using real-time tracking and communication technologies. One potential that has hardly been exploited is the use of cell phone tracking information for activity-based transportation studies. With GPS-embedded smartphones it is convenient to continuously record a day's trajectories with little information loss, and as smartphones get cheaper and attract more users, this source of self-tracking data becomes pervasive. This study provides a cell-phone-plus-web method that collects volunteered cell phone tracking data and uses an algorithm to identify the allocation of activities and travel in space and time. It also adds a user-participated prompted-recall step for attribute identification (travel modes and activity types), which supplements data preparation for activity-based travel demand modeling. Besides volunteered geospatial information, cell phone users' real-time locations are routinely collected by service providers such as Apple, AT&T, and many third-party companies, and these location data have in turn been used to build new location-based services. However, few applications address dynamic human interactions and the spatio-temporal constraints of activities. This study sets up a framework for a new kind of location-based service that finds joint-activity opportunities for multiple individuals, and demonstrates its feasibility using a spatio-temporal GIS approach.
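    The core of such a pipeline, allocating activities and travel along a GPS trajectory, can be illustrated with a minimal dwell-detection sketch. The function name, thresholds, and clustering rule below are illustrative assumptions, not the study's actual algorithm:

```python
from math import hypot

def detect_activities(track, radius=50.0, min_dwell=300.0):
    """Split a GPS track into activity (dwell) and travel episodes.

    track: chronologically ordered (t_seconds, x_m, y_m) fixes in a
    projected coordinate system.  A run of fixes staying within `radius`
    metres of the run's first fix for at least `min_dwell` seconds is
    labelled an activity; everything else is travel.
    """
    episodes, i, n = [], 0, len(track)
    while i < n:
        t0, x0, y0 = track[i]
        j = i + 1
        while j < n and hypot(track[j][1] - x0, track[j][2] - y0) <= radius:
            j += 1
        if track[j - 1][0] - t0 >= min_dwell:
            episodes.append(("activity", t0, track[j - 1][0]))
            i = j
        else:
            # Not a dwell: extend (or open) the current travel episode.
            if episodes and episodes[-1][0] == "travel":
                episodes[-1] = ("travel", episodes[-1][1], track[i][0])
            else:
                episodes.append(("travel", t0, track[i][0]))
            i += 1
    return episodes
```

    Prompted recall would then ask the volunteer to label each detected episode with an activity type or, for travel episodes, a travel mode.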

    Design study of general aviation collision avoidance system

    The selection and design of a time/frequency collision avoidance system for use in general aviation aircraft is discussed. The modifications to airline transport collision avoidance equipment that were made to produce the simpler general aviation system are described, and the threat determination capabilities and operating principles of the general aviation system are illustrated.
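    Threat determination in collision avoidance systems of this kind is commonly based on tau, the range divided by the closure rate. The sketch below shows the idea with purely illustrative thresholds, not the certified values of the system described:

```python
def threat_level(range_m, range_rate_mps, alt_sep_m,
                 tau_warn=40.0, tau_alarm=25.0, alt_limit=150.0):
    """Classify an intruder with a tau (range / closure-rate) test.

    tau is the time to collision if the current closure rate persists.
    All thresholds here are illustrative, not certified CAS values.
    """
    if range_rate_mps >= 0 or abs(alt_sep_m) > alt_limit:
        return "clear"            # diverging, or vertically separated
    tau = range_m / -range_rate_mps
    if tau <= tau_alarm:
        return "alarm"
    if tau <= tau_warn:
        return "warning"
    return "clear"
```

    The attraction of the tau criterion is that it needs only range and range rate, which a time/frequency system can derive from synchronized clock exchanges rather than from bearing measurements.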

    Fault-tolerant computer study

    A set of building-block circuits is described which can be used with commercially available microprocessors and memories to implement fault-tolerant distributed computer systems. Each building-block circuit is intended for VLSI implementation as a single chip. Several building blocks together with associated processor and memory chips form a self-checking computer module with self-contained input/output and interfaces to redundant communication buses. Fault tolerance is achieved by connecting self-checking computer modules into a redundant network in which backup buses and computer modules are provided to circumvent failures. The requirements and design methodology that led to the definition of the building-block circuits are discussed.
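    The self-checking-module idea can be sketched as duplicated computation with comparison, where a detected mismatch routes the request to a backup module. The function names and failure model below are illustrative assumptions, not the actual building-block design:

```python
class FaultDetected(Exception):
    """Raised when duplicated results disagree (a detected module fault)."""

def self_checking_call(primary, shadow, x):
    # A self-checking module computes everything twice and compares,
    # so a single-module fault is detected rather than propagated.
    a, b = primary(x), shadow(x)
    if a != b:
        raise FaultDetected("duplicated results disagree")
    return a

def redundant_network_call(module_pairs, x):
    # Try each self-checking module in turn; a detected fault causes
    # failover to a backup module instead of returning bad data.
    for primary, shadow in module_pairs:
        try:
            return self_checking_call(primary, shadow, x)
        except FaultDetected:
            continue
    raise RuntimeError("no fault-free module available")
```

    The key property, as in the hardware scheme, is that faults are detected at the module boundary, so the redundant network only ever sees checked results.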

    Data collection procedures for the Software Engineering Laboratory (SEL) database

    This document is a guidebook to collecting software engineering data on software development and maintenance efforts, as practiced in the Software Engineering Laboratory (SEL). It supersedes the document entitled Data Collection Procedures for the Rehosted SEL Database (SEL-87-008 in the SEL series), published in October 1987. It presents procedures to be followed on software development and maintenance projects in the Flight Dynamics Division (FDD) of Goddard Space Flight Center (GSFC) for collecting data in support of SEL software engineering research activities. These procedures include detailed instructions for the completion and submission of SEL data collection forms.

    Formal functional testing of graphical user interfaces

    SIGLE. Available from the British Library Document Supply Centre (BLDSC), DSC:DX177960, United Kingdom.

    Second CLIPS Conference Proceedings, volume 1

    Topics covered at the Second CLIPS Conference, held at the Johnson Space Center, September 23-25, 1991, are given. Topics include rule groupings, fault detection using expert systems, decision making using expert systems, knowledge representation, computer-aided design, and debugging expert systems.

    A Study on Transfer Function Design for Direct Volume Rendering

    Ph.D. dissertation, Department of Electrical and Computer Engineering, Seoul National University Graduate School, February 2017. Advisor: Yeong-Gil Shin. Although direct volume rendering (DVR) has become a commodity, the design of transfer functions is still a challenge. Transfer functions, which map data values to optical properties (i.e., colors and opacities), highlight features of interest and hide unimportant regions, dramatically impacting the quality of the visualization. Therefore, for the effective rendering of interesting features, the design of transfer functions is a very important and challenging task, and the manipulation of these transfer functions is tedious and time-consuming. In this dissertation, we propose a 3D spatial field for accurately identifying and visually distinguishing interesting features, as well as a mechanism for data exploration using a multi-dimensional transfer function. First, we introduce a 3D spatial field for the effective visualization of constricted tubular structures, called a stenosis map, which stores the degree of constriction at each voxel. Constrictions within tubular structures are quantified using newly proposed measures (i.e., a line similarity measure and a constriction measure) based on localized structure analysis, and classified with a proposed transfer function mapping the degree of constriction to color and opacity. We show the application of our method to the visualization of coronary artery stenoses and present performance evaluations on twenty-eight clinical datasets, demonstrating the high accuracy and efficacy of the proposed method. Second, we propose a new multi-dimensional transfer function which incorporates texture features calculated from statistically homogeneous regions. This approach employs parallel coordinates to provide an intuitive interface for exploring the new multi-dimensional transfer function space; three specific ways to use the transfer function based on parallel coordinates enable the effective exploration of large and complex datasets. We present a mechanism for data exploration with the new transfer function space, demonstrating the practical efficacy of our proposed method. Through this study on transfer function design for DVR, we propose two useful approaches: the first, which saliently visualizes constrictions within tubular structures and lets the user interactively adjust their visual appearance, delivers a substantial aid in radiologic practice; the second, which classifies objects with an intuitive interface utilizing parallel coordinates, proves to be a powerful tool for complex data exploration.
    Contents: Chapter 1, Introduction (background on volume rendering, computer-aided diagnosis, and parallel coordinates; problem statement; main contribution; organization of the dissertation). Chapter 2, Related Work (transfer functions based on spatial characteristics; opacity modulation techniques; multi-dimensional transfer functions; manipulation mechanisms for transfer functions; coronary artery stenosis; parallel coordinates). Chapter 3, Volume Visualization of Constricted Tubular Structures (localized structure analysis; stenosis map detection and computation; stenosis-based classification; GPU implementation; qualitative, quantitative, and comparative evaluation; parameter study). Chapter 4, Interactive Multi-Dimensional Transfer Function Using Adaptive Block-Based Feature Analysis (extraction of statistical and texture features; transfer function design using parallel coordinates; experimental results). Chapter 5, Conclusion.
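    The basic notion of a transfer function, mapping data values to colors and opacities, can be sketched as piecewise-linear interpolation over control points. This is a generic 1-D transfer function for illustration only, not the stenosis-based or multi-dimensional functions proposed in the dissertation:

```python
def make_transfer_function(control_points):
    """Build a 1-D transfer function from (value, (r, g, b, a)) control
    points, linearly interpolating colour and opacity between them."""
    pts = sorted(control_points)

    def tf(v):
        # Clamp outside the control-point range.
        if v <= pts[0][0]:
            return pts[0][1]
        if v >= pts[-1][0]:
            return pts[-1][1]
        # Linear interpolation within the bracketing segment.
        for (v0, c0), (v1, c1) in zip(pts, pts[1:]):
            if v0 <= v <= v1:
                t = (v - v0) / (v1 - v0)
                return tuple(a + t * (b - a) for a, b in zip(c0, c1))

    return tf
```

    A renderer evaluates such a function per sample along each viewing ray; the dissertation's contribution is to drive the mapping with spatial quantities (the stenosis map) and texture features rather than the raw scalar value alone.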

    Statistical strategies for constructing health risk models with multiple pollutants and their interactions: possible choices and comparisons

    Background: As public awareness of the consequences of environmental exposures has grown, estimating the adverse health effects of simultaneous exposure to multiple pollutants has become an important topic to explore. The challenges of evaluating the health impacts of environmental factors in a multipollutant model include, but are not limited to: identification of the most critical components of the pollutant mixture, examination of potential interaction effects, and attribution of health effects to individual pollutants in the presence of multicollinearity.
    Methods: In this paper, we reviewed five methods available in the statistical literature that are potentially helpful for constructing multipollutant models. We conducted a simulation study and presented two data examples to assess the performance of these methods on feature selection, effect estimation, and interaction identification using both cross-sectional and time-series designs. We also proposed and evaluated a two-step strategy employing an initial screening by a tree-based method, followed by further dimension reduction/variable selection with the aforementioned five approaches at the second step.
    Results: Among the five methods, least absolute shrinkage and selection operator (LASSO) regression performs well in general for identifying important exposures, but yields biased estimates and a slightly larger model dimension given many correlated candidate exposures and a modest sample size. Bayesian model averaging and supervised principal component analysis are also useful for variable selection when there is a moderately strong exposure-response association. Substantial improvements in reducing model dimension and identifying important variables were observed for all five statistical methods under the two-step modeling strategy when the number of candidate variables is large.
    Conclusions: There is no uniform dominance of one method across all simulation scenarios and criteria. The performances differ according to the nature of the response variable, the sample size, the number of pollutants involved, and the strength of the exposure-response association/interaction. However, the two-step modeling strategy proposed here is potentially applicable in a multipollutant framework with many covariates, taking advantage of both the screening feature of the initial tree-based method and the dimension reduction/variable selection property of the subsequent method. The choice of method should also depend on the goal of the study: risk prediction, effect estimation, or screening for important predictors and their interactions.
    http://deepblue.lib.umich.edu/bitstream/2027.42/112386/1/12940_2013_Article_691.pd
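    The two-step strategy (tree-based screening, then shrinkage-based selection) can be sketched with scikit-learn on synthetic data. The screening size, forest settings, and data-generating model below are illustrative assumptions, not the paper's simulation design:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LassoCV

rng = np.random.default_rng(0)
n, p = 200, 30                       # modest sample, many candidate exposures
X = rng.normal(size=(n, p))
# Outcome driven by two pollutants and their interaction (synthetic).
y = 2.0 * X[:, 0] - 1.5 * X[:, 1] + X[:, 0] * X[:, 1] + rng.normal(size=n)

# Step 1: tree-based screening -- keep the top exposures by importance.
forest = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
keep = np.argsort(forest.feature_importances_)[::-1][:10]

# Step 2: LASSO on the screened subset for sparse effect estimation.
lasso = LassoCV(cv=5).fit(X[:, keep], y)
selected = keep[lasso.coef_ != 0]
```

    The screening step cuts the candidate set before the shrinkage step, which is the source of the reduced model dimension the simulations report; interaction terms could be added to the screened design matrix before the second step.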