5,823 research outputs found

    Measuring Kinematic Variables in Front Crawl Swimming Using Accelerometers: A Validation Study

    Objective data on swimming performance are needed to meet the demands of the swimming coach and athlete. The purpose of this study was to use multiple inertial measurement units to calculate Lap Time, Velocity, Stroke Count, Stroke Duration, Stroke Rate and Phases of the Stroke (Entry, Pull, Push, Recovery) in front crawl swimming. Using multiple units on the body, an algorithm was developed to calculate the phases of the stroke from a search window positioned relative to the body-roll peaks and troughs. Twelve swimmers, equipped with these devices, performed fatiguing trials. The calculated variables were compared with the same measures derived from video data, showing strong agreement for all variables. Four swimmers required individual adaptation of the stroke-phase calculation method. This customization requirement demonstrates that a single body-worn device will not be able to determine these phases of the stroke with sufficient accuracy.
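
    The abstract does not spell out the phase-detection algorithm beyond its use of a search window anchored to the body-roll peaks and troughs. A minimal sketch of that general idea is given below; the sampling rate, window width, signal names and the rules used to place phase boundaries are illustrative assumptions, not the authors' method.

        import numpy as np
        from scipy.signal import find_peaks

        def segment_stroke_phases(roll_deg, forearm_acc, fs=100, window_s=0.6):
            """Illustrative stroke-phase segmentation anchored to body-roll events.

            roll_deg    : body-roll angle (degrees) from a trunk-mounted IMU
            forearm_acc : forward acceleration from a forearm IMU (same length)
            fs          : sampling rate in Hz (assumed)
            window_s    : half-width of the search window around each roll event
            """
            # Each roll peak/trough marks roughly one arm's stroke cycle.
            peaks, _ = find_peaks(roll_deg, distance=int(0.8 * fs))
            troughs, _ = find_peaks(-roll_deg, distance=int(0.8 * fs))
            events = np.sort(np.concatenate([peaks, troughs]))

            half = int(window_s * fs)
            phases = []
            for e in events:
                lo, hi = max(0, e - half), min(len(forearm_acc), e + half)
                window = forearm_acc[lo:hi]
                if window.size == 0:
                    continue
                # Hypothetical rules: hand entry near the acceleration minimum
                # inside the window, start of the push phase near the maximum.
                phases.append({
                    "roll_event": int(e),
                    "entry": lo + int(np.argmin(window)),
                    "push": lo + int(np.argmax(window)),
                })
            return phases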

    Enhancing pharmaceutical packaging through a technology ecosystem to facilitate the reuse of medicines and reduce medicinal waste

    The idea of reusing dispensed medicines is appealing to the general public provided its benefits are illustrated, its risks minimized, and the logistics resolved. For example, medicine reuse could help reduce medicinal waste, protect the environment and improve public health. However, the associated technologies and legislation facilitating medicine reuse are generally not available. The availability of suitable technologies could arguably help shape stakeholders' beliefs and, in turn, the uptake of a future medicine reuse scheme by tackling the risks and facilitating the practicalities. A literature survey was undertaken to lay the groundwork for implementing technologies on and around pharmaceutical packaging in order to meet stakeholders' previously expressed misgivings about medicine reuse ('stakeholder requirements') and to propose a novel ecosystem for, in effect, reusing returned medicines. Methods: A structured literature search examining the application of existing technologies on pharmaceutical packaging to enable medicine reuse was conducted and presented as a narrative review. Results: The reviewed technologies are classified according to different stakeholders' requirements, and a novel ecosystem is suggested from a technology perspective as a solution for reusing medicines. Conclusion: Active sensing technologies applied to pharmaceutical packaging using printed electronics allow medicines to become part of an Internet of Things network. Validating the quality and safety of returned medicines through this network appears to be the most effective route to medicine reuse, and the correct application of technologies may be the key enabler.

    A Field Guide to Genetic Programming

    xiv, 233 p. : ill. ; 23 cm. Electronic book. A Field Guide to Genetic Programming (ISBN 978-1-4092-0073-4) is an introduction to genetic programming (GP). GP is a systematic, domain-independent method for getting computers to solve problems automatically starting from a high-level statement of what needs to be done. Using ideas from natural evolution, GP starts from an ooze of random computer programs and progressively refines them through processes of mutation and sexual recombination until solutions emerge. All this without the user having to know or specify the form or structure of solutions in advance. GP has generated a plethora of human-competitive results and applications, including novel scientific discoveries and patentable inventions. Contents: Introduction -- Representation, initialisation and operators in tree-based GP -- Getting ready to run genetic programming -- Example genetic programming run -- Alternative initialisations and operators in tree-based GP -- Modular, grammatical and developmental tree-based GP -- Linear and graph genetic programming -- Probabilistic genetic programming -- Multi-objective genetic programming -- Fast and distributed genetic programming -- GP theory and its applications -- Applications -- Troubleshooting GP -- Conclusions -- Appendices: Resources; TinyGP -- Bibliography -- Index.
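
    The book's TinyGP appendix gives a complete implementation; as a far smaller illustration of the loop described above (a population of random programs refined by selection, crossover and mutation), here is a hedged sketch of tree-based GP for a toy symbolic-regression task. The function and terminal sets, parameter values and operator choices are illustrative assumptions and are not taken from the book.

        import copy
        import operator
        import random

        # Toy symbolic-regression target: recover f(x) = x**2 + x from samples.
        FUNCS = [(operator.add, '+'), (operator.sub, '-'), (operator.mul, '*')]
        TERMS = ['x', 1.0, 2.0]
        CASES = [(x, x * x + x) for x in range(-5, 6)]

        def random_tree(depth=3):
            """Grow a random expression tree stored as [func, symbol, left, right]."""
            if depth == 0 or random.random() < 0.3:
                return random.choice(TERMS)
            func, symbol = random.choice(FUNCS)
            return [func, symbol, random_tree(depth - 1), random_tree(depth - 1)]

        def evaluate(tree, x):
            if not isinstance(tree, list):
                return x if tree == 'x' else tree
            return tree[0](evaluate(tree[2], x), evaluate(tree[3], x))

        def fitness(tree):
            """Sum of absolute errors over the training cases (lower is better)."""
            return sum(abs(evaluate(tree, x) - y) for x, y in CASES)

        def nodes(tree, path=()):
            """Yield (path, subtree) pairs so the variation operators can pick a point."""
            yield path, tree
            if isinstance(tree, list):
                for i in (2, 3):
                    yield from nodes(tree[i], path + (i,))

        def replace(tree, path, subtree):
            """Return a copy of tree with the node at path replaced by subtree."""
            if not path:
                return copy.deepcopy(subtree)
            new = copy.deepcopy(tree)
            parent = new
            for i in path[:-1]:
                parent = parent[i]
            parent[path[-1]] = copy.deepcopy(subtree)
            return new

        def crossover(recipient, donor):
            """Subtree crossover: copy a random donor subtree into the recipient."""
            path, _ = random.choice(list(nodes(recipient)))
            _, subtree = random.choice(list(nodes(donor)))
            return replace(recipient, path, subtree)

        def mutate(tree):
            """Subtree mutation: replace a random node with a fresh random subtree."""
            path, _ = random.choice(list(nodes(tree)))
            return replace(tree, path, random_tree(depth=2))

        def tournament(population, k=3):
            return min(random.sample(population, k), key=fitness)

        def evolve(pop_size=200, generations=30, mutation_rate=0.2):
            population = [random_tree() for _ in range(pop_size)]
            for _ in range(generations):
                best = min(population, key=fitness)
                if fitness(best) == 0:
                    return best
                offspring = [best]                       # elitism: keep the best program
                while len(offspring) < pop_size:
                    child = crossover(tournament(population), tournament(population))
                    if random.random() < mutation_rate:
                        child = mutate(child)
                    offspring.append(child)
                population = offspring
            return min(population, key=fitness)

        if __name__ == '__main__':
            winner = evolve()
            print('best error:', fitness(winner))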

    Gradients in urban material composition: A new concept to map cities with spaceborne imaging spectroscopy data

    To understand processes in urban environments, such as urban energy fluxes or surface temperature patterns, it is important to map urban surface materials. Airborne imaging spectroscopy data have been successfully used to identify urban surface materials, mainly based on unmixing algorithms. Upcoming spaceborne Imaging Spectrometers (IS), such as the Environmental Mapping and Analysis Program (EnMAP), will reduce the time- and cost-related limitations of airborne systems for Earth Observation (EO). However, the spatial resolution of all operational and planned spaceborne IS will be no finer than 20 to 30 m and, thus, the detection of pure Endmember (EM) candidates in urban areas, a requirement for spectral unmixing, is very limited. Gradient analysis could be an alternative method for retrieving urban surface material compositions in pixels from spaceborne IS. The gradient concept is well known in ecology for identifying plant species assemblages formed by similar environmental conditions but has never been tested for urban materials. However, urban areas also contain neighbourhoods with similar physical, compositional and structural characteristics. Based on this assumption, this study investigated (1) whether cover fractions of surface materials change gradually in urban areas and (2) whether these gradients can be adequately mapped and interpreted using imaging spectroscopy data (e.g. EnMAP) with 30 m spatial resolution. Similarities of material compositions were analysed on the basis of 153 systematically distributed samples on a detailed surface material map using Detrended Correspondence Analysis (DCA). Determined gradient scores for the first two gradients were regressed against the corresponding mean reflectance of simulated EnMAP spectra using Partial Least Squares (PLS) regression models. Results show strong correlations, with R2 = 0.85 and R2 = 0.71 and RMSEs of 0.24 and 0.21 for the first and second axis, respectively. The subsequent mapping of the first gradient reveals patterns that correspond to the transition from predominantly vegetation classes to the dominance of artificial materials. Patterns resulting from the second gradient are associated with surface material compositions related to finer structural differences in urban structures. The composite gradient map shows patterns of common surface material compositions that can be related to urban land use classes such as Urban Structure Types (UST). By linking the knowledge of typical material compositions with urban structures, gradient analysis appears to be a powerful tool for mapping characteristic material compositions in 30 m imaging spectroscopy data of urban areas.
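
    The abstract names the modelling steps (DCA ordination of the material samples, then PLS regression of the gradient scores against simulated EnMAP spectra) without giving details. The snippet below is a hedged sketch of the regression step only, using random placeholder data; the array shapes, number of latent components and the train/test split are assumptions, and the DCA step itself is not shown.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.model_selection import train_test_split
        from sklearn.metrics import r2_score, mean_squared_error

        n_samples, n_bands = 153, 242                   # sample plots x EnMAP-like bands (assumed)
        rng = np.random.default_rng(0)
        spectra = rng.random((n_samples, n_bands))      # mean reflectance per sample (placeholder)
        dca_scores = rng.random((n_samples, 2))         # scores on gradient axes 1 and 2 (placeholder)

        X_tr, X_te, y_tr, y_te = train_test_split(spectra, dca_scores, random_state=0)

        pls = PLSRegression(n_components=10)            # number of latent components assumed
        pls.fit(X_tr, y_tr)
        pred = pls.predict(X_te)

        for axis in range(2):
            r2 = r2_score(y_te[:, axis], pred[:, axis])
            rmse = mean_squared_error(y_te[:, axis], pred[:, axis]) ** 0.5
            print(f"gradient axis {axis + 1}: R2={r2:.2f}, RMSE={rmse:.2f}")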

    FY 1989 scientific and technical reports, articles, papers, and presentations

    A compendium of bibliographic references to papers presented by Marshall Space Flight Center (MSFC) personnel and contractors during FY 1989 is provided. The papers include formal NASA technical reports, memoranda, papers published in technical journals, and presentations by MSFC personnel. The formal NASA technical reports and memoranda include abstracts. Sources for obtaining these documents are also included.

    Cognitively-Engineered Multisensor Data Fusion Systems for Military Applications

    The fusion of imagery from multiple sensors is a field of research that has been gaining prominence in the scientific community in recent years. The technical aspects of combining multisensor information have been and are being studied extensively. However, the cognitive aspects of multisensor data fusion have received far less attention. Prior research in the field of cognitive engineering has shown that the cognitive aspects of any human-machine system should be taken into consideration in order to achieve systems that are both safe and useful. The goal of this research was to model how humans interpret multisensory data and to evaluate the value of a cognitively-engineered multisensor data fusion system as an effective, time-saving means of presenting information in high-stress situations. Specifically, this research used principles from cognitive engineering to design, implement, and evaluate a multisensor data fusion system for pilots in high-stress situations. Two preliminary studies were performed, and concurrent protocol analysis was conducted to determine how humans interpret and mentally fuse information from multiple sensors in both low- and high-stress environments. This information was used to develop a model of human processing of information from multiple data sources. This model was then implemented in the development of algorithms for fusing imagery from several disparate sensors (visible and infrared). The model and the system as a whole were empirically evaluated in an experiment with fighter pilots in a simulated combat environment. The results show that the model is an accurate depiction of how humans interpret information from multiple disparate sensors, and that the algorithms show promise for assisting fighter pilots in quicker and more accurate target identification.
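
    The abstract does not describe the fusion algorithms themselves, which were derived from the cognitive model. Purely as an illustration of pixel-level fusion of co-registered visible and infrared frames, a generic weighted-average sketch is shown below; the weighting scheme, normalisation and array names are assumptions and are not the cognitively-engineered method evaluated in the study.

        import numpy as np

        def fuse_visible_ir(visible, infrared, ir_weight=0.4):
            """Fuse co-registered grayscale visible and IR frames into one image.

            visible, infrared : 2-D float arrays in [0, 1], same shape
            ir_weight         : contribution of the IR channel (assumed value)
            """
            if visible.shape != infrared.shape:
                raise ValueError("frames must be co-registered and the same size")

            def norm(img):
                # Normalise each band so neither sensor dominates because of dynamic range.
                rng = img.max() - img.min()
                return (img - img.min()) / rng if rng > 0 else np.zeros_like(img)

            fused = (1.0 - ir_weight) * norm(visible) + ir_weight * norm(infrared)
            return np.clip(fused, 0.0, 1.0)

        # Example with synthetic frames
        vis = np.random.rand(480, 640)
        ir = np.random.rand(480, 640)
        out = fuse_visible_ir(vis, ir)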

    Hydrogen Sulfide Flux Measurements And Dispersion Modeling From Construction And Demolition Debris Landfills

    Odor problems are a common complaint from residents living near landfills. Many compounds can cause malodorous conditions; however, hydrogen sulfide (H2S) has been identified as a principal odorous component from construction and demolition (C&D) debris landfills. Although several studies have reported the ambient concentrations of H2S near C&D landfills, few studies have quantified emission rates of H2S. The most widely used and proven technique for measuring gas emission rates from landfills is the flux chamber method. Typically, the flux chamber is a cylindrical enclosure with a spherical top that limits the gas emission area. Zero-grade air is introduced into the chamber, allowed to mix with the gases emitted from the enclosed landfill surface, and then transported to the exit port, where concentrations can be measured. Flux measurements using the flux chamber were performed at five different C&D landfills from June to August 2003. The flux rates of H2S measured in this research were three to six orders of magnitude lower than the flux rates of methane reported in the literature. In addition to the H2S flux measurements, dispersion modeling was conducted using the EPA dispersion model Industrial Source Complex Short Term (ISCST3) in order to evaluate impacts on landfill workers and communities around the landfills. The modeling results were analyzed to estimate the potential maximum ground-level H2S concentrations for 1-hr and 3-min averaging periods and the frequency (occurrences per year) above the H2S odor detection threshold for each landfill. Based on a 0.5-ppb odor detection threshold, odor complaints could be expected at four of the five landfills selected for this study.
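
    The abstract does not give the emission-rate calculation, but the standard flux-chamber relation (emission flux = sweep-air flow rate x outlet concentration / enclosed surface area) is commonly used with this method. The sketch below applies that relation; the chamber area, sweep flow and concentration values are illustrative, not measurements from the study.

        # Standard flux-chamber emission calculation for H2S (illustrative values).
        M_H2S = 34.08          # g/mol
        MOLAR_VOLUME = 24.45   # L/mol of air at 25 degC and 1 atm

        def h2s_flux(conc_ppb, sweep_flow_l_min, chamber_area_m2):
            """Return H2S emission flux in g per m2 per second."""
            conc_g_per_l = conc_ppb * 1e-9 * M_H2S / MOLAR_VOLUME    # ppbv -> g H2S per L of air
            mass_rate_g_s = conc_g_per_l * sweep_flow_l_min / 60.0   # g emitted per second
            return mass_rate_g_s / chamber_area_m2

        # Example: 50 ppb at the chamber outlet, 5 L/min sweep air, 0.13 m2 footprint (assumed)
        print(f"{h2s_flux(50, 5.0, 0.13):.3e} g m^-2 s^-1")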

    Coalition Battle Management Language (C-BML) Study Group Final Report

    Interoperability across Modeling and Simulation (M&S) and Command and Control (C2) systems continues to be a significant problem for today's warfighters. M&S is well established in military training, but it could also be a valuable asset for planning and mission rehearsal if M&S and C2 systems were able to exchange information, plans, and orders more effectively. To better support the warfighter with M&S-based capabilities, an open, standards-based framework is needed that establishes operational and technical coherence between C2 and M&S systems.

    Annual Report Of Research and Creative Productions by Faculty and Staff from January to December, 2004.

    Annual Report Of Research and Creative Productions by Faculty and Staff from January to December, 2004

    Speech-driven Animation with Meaningful Behaviors

    Conversational agents (CAs) play an important role in human-computer interaction. Creating believable movements for CAs is challenging, since the movements have to be meaningful and natural, reflecting the coupling between gestures and speech. Past studies have mainly relied on rule-based or data-driven approaches. Rule-based methods focus on creating meaningful behaviors that convey the underlying message, but the gestures cannot be easily synchronized with speech. Data-driven approaches, especially speech-driven models, can capture the relationship between speech and gestures, but they create behaviors that disregard the meaning of the message. This study proposes to bridge the gap between these two approaches, overcoming their limitations. The approach builds a dynamic Bayesian network (DBN) in which a discrete variable is added to condition the behaviors on an underlying constraint. The study implements and evaluates the approach with two constraints: discourse functions and prototypical behaviors. By constraining on discourse functions (e.g., questions), the model learns the characteristic behaviors associated with a given discourse class, learning the rules from the data. By constraining on prototypical behaviors (e.g., head nods), the approach can be embedded in a rule-based system as a behavior realizer, creating trajectories that are synchronized in time with speech. The study proposes a DBN structure and a training approach that (1) models the cause-effect relationship between the constraint and the gestures, (2) initializes the state configuration models, increasing the range of the generated behaviors, and (3) captures the differences in behaviors across constraints by enforcing sparse transitions between shared and exclusive states per constraint. Objective and subjective evaluations demonstrate the benefits of the proposed approach over an unconstrained model. (13 pages, 12 figures, 5 tables)
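
    The abstract only sketches the DBN at a high level. As a toy illustration of the final idea, enforcing sparse transitions so that every constraint can use a pool of shared states but cannot enter states exclusive to another constraint, the snippet below builds such a transition mask; the state counts, constraint labels and masking rule are assumptions, not the paper's model.

        import numpy as np

        n_shared = 4
        n_exclusive = 3                     # exclusive states per constraint (assumed)
        constraints = ["question", "affirmation"]
        n_states = n_shared + n_exclusive * len(constraints)

        def owner(state):
            """Return which constraint owns a state, or None if it is shared."""
            if state < n_shared:
                return None
            return constraints[(state - n_shared) // n_exclusive]

        def transition_mask(active_constraint):
            """1 where a transition is allowed under the active constraint, else 0."""
            mask = np.ones((n_states, n_states))
            for j in range(n_states):
                if owner(j) not in (None, active_constraint):
                    mask[:, j] = 0.0        # cannot enter another constraint's states
            return mask

        # A learned (here random) transition matrix, masked and re-normalised
        rng = np.random.default_rng(0)
        A = rng.random((n_states, n_states))
        A_question = A * transition_mask("question")
        A_question /= A_question.sum(axis=1, keepdims=True)
        print(A_question.round(2))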