
    Recent Developments in Nonregular Fractional Factorial Designs

    Nonregular fractional factorial designs, such as Plackett-Burman designs and other orthogonal arrays, are widely used in screening experiments for their run-size economy and flexibility. The traditional analysis focuses on main effects only. Hamada and Wu (1992) went beyond the traditional approach and proposed an analysis strategy demonstrating that some interactions can be entertained and estimated beyond a few significant main effects. Their groundbreaking work stimulated many of the recent developments in design criterion creation, construction, and analysis of nonregular designs. This paper reviews important developments in optimality criteria and their comparison, including projection properties, generalized resolution, various generalized minimum aberration criteria, optimality results, construction methods, and analysis strategies for nonregular designs. Comment: Submitted to Statistics Surveys (http://www.i-journals.org/ss/) by the Institute of Mathematical Statistics (http://www.imstat.org).

    Intelligent Processing in Wireless Communications Using Particle Swarm Based Methods

    There are many optimization needs in the research and design of wireless communication systems. Many of these optimization problems are Nondeterministic Polynomial (NP) hard and have not been solved well; many other, non-NP-hard optimization problems are combinatorial and do not have satisfying solutions either. This dissertation presents a series of Particle Swarm Optimization (PSO) based search and optimization algorithms that solve open research and design problems in wireless communications, problems that were previously either avoided or solved only approximately. PSO is a bottom-up approach to optimization that imposes no conditions on the underlying problem. Its simple formulation makes it easy to implement, apply, extend, and hybridize. The algorithm uses simple operators such as adders and multipliers to travel through the search space, and the process requires just five simple steps. PSO is also easy to control because it has a limited number of parameters and is less sensitive to them than other swarm intelligence algorithms. It does not depend on initial points and converges very fast. Four types of PSO-based approaches are proposed, targeting four different kinds of problems in wireless communications. First, binary PSO and continuous PSO are used together to find optimal compositions of Gaussian derivative pulses to form several UWB pulses that not only comply with the FCC spectrum mask but also best exploit the available spectrum and power. Second, three different PSO-based algorithms are developed to solve the NLOS/LOS channel differentiation, NLOS range error mitigation, and multilateration problems, respectively. Third, a PSO-based search method is proposed to find optimal orthogonal code sets that reduce inter-carrier interference effects in a frequency-redundant OFDM system. Fourth, a PSO-based phase optimization technique is proposed to reduce the PAPR of a frequency-redundant OFDM system. The PSO-based approaches are compared with other canonical solutions to these communication problems and show superior performance in many respects, as confirmed by the analysis and simulation results provided. Open questions and future work for the dissertation are proposed to serve as a guide for future research efforts.
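    The abstract above describes the canonical PSO update, which relies only on additions and multiplications over a small set of parameters. The following minimal sketch is not taken from the dissertation; the inertia weight, acceleration constants, and sphere test objective are illustrative assumptions. It shows the basic continuous PSO loop.

```python
import numpy as np

def pso(objective, dim, n_particles=30, iters=200,
        w=0.7, c1=1.5, c2=1.5, bounds=(-5.0, 5.0)):
    """Minimal continuous PSO: the velocity/position updates use only
    additions and multiplications, as noted in the abstract."""
    lo, hi = bounds
    rng = np.random.default_rng(0)
    x = rng.uniform(lo, hi, (n_particles, dim))       # positions
    v = np.zeros((n_particles, dim))                  # velocities
    pbest = x.copy()                                  # personal bests
    pbest_val = np.apply_along_axis(objective, 1, x)
    gbest = pbest[pbest_val.argmin()].copy()          # global best

    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        vals = np.apply_along_axis(objective, 1, x)
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest, pbest_val.min()

# Example: minimize the sphere function in 10 dimensions.
best_x, best_f = pso(lambda z: float(np.sum(z ** 2)), dim=10)
print(best_f)
```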

    NASA Tech Briefs, September 2006

    Topics covered include: Improving Thermomechanical Properties of SiC/SiC Composites; Aerogel/Particle Composites for Thermoelectric Devices; Patches for Repairing Ceramics and Ceramic-Matrix Composites; Lower-Conductivity Ceramic Materials for Thermal-Barrier Coatings; An Alternative for Emergency Preemption of Traffic Lights; Vehicle Transponder for Preemption of Traffic Lights; Automated Announcements of Approaching Emergency Vehicles; Intersection Monitor for Traffic-Light-Preemption System; Full-Duplex Digital Communication on a Single Laser Beam; Stabilizing Microwave Frequency of a Photonic Oscillator; Microwave Oscillators Based on Nonlinear WGM Resonators; Pointing Reference Scheme for Free-Space Optical Communications Systems; High-Level Performance Modeling of SAR Systems; Spectral Analysis Tool 6.2 for Windows; Multi-Platform Avionics Simulator; Silicon-Based Optical Modulator with Ferroelectric Layer; Multiplexing Transducers Based on Tunnel-Diode Oscillators; Scheduling with Automated Resolution of Conflicts; Symbolic Constraint Maintenance Grid; Discerning Trends in Performance Across Multiple Events; Magnetic Field Solver; Computing for Aiming a Spaceborne Bistatic-Radar Transmitter; 4-Vinyl-1,3-Dioxolane-2-One as an Additive for Li-Ion Cells; Probabilistic Prediction of Lifetimes of Ceramic Parts; STRANAL-PMC Version 2.0; Micromechanics and Piezo Enhancements of HyperSizer; Single-Phase Rare-Earth Oxide/Aluminum Oxide Glasses; Tilt/Tip/Piston Manipulator with Base-Mounted Actuators; Measurement of Model Noise in a Hard-Wall Wind Tunnel; Loci-STREAM Version 0.9; The Synergistic Engineering Environment; Reconfigurable Software for Controlling Formation Flying; More About the Tetrahedral Unstructured Software System; Computing Flows Using Chimera and Unstructured Grids; Avoiding Obstructions in Aiming a High-Gain Antenna; Analyzing Aeroelastic Stability of a Tilt-Rotor Aircraft; Tracking Positions and Attitudes of Mars Rovers; Stochastic Evolutionary Algorithms for Planning Robot Paths; Compressible Flow Toolbox; Rapid Aeroelastic Analysis of Blade Flutter in Turbomachines; General Flow-Solver Code for Turbomachinery Applications; Code for Multiblock CFD and Heat-Transfer Computations; Rotating-Pump Design Code; Covering a Crucible with Metal Containing Channels; Repairing Fractured Bones by Use of Bioabsorbable Composites; Kalman Filter for Calibrating a Telescope Focal Plane; Electronic Absolute Cartesian Autocollimator; Fiber-Optic Gratings for Lidar Measurements of Water Vapor; Simulating Responses of Gravitational-Wave Instrumentation; SOFTC: A Software Correlator for VLBI; Progress in Computational Simulation of Earthquakes; Database of Properties of Meteors; Computing Spacecraft Solar-Cell Damage by Charged Particles; Thermal Model of a Current-Carrying Wire in a Vacuum; Program for Analyzing Flows in a Complex Network; Program Predicts Performance of Optical Parametric Oscillators; Processing TES Level-1B Data; Automated Camera Calibration; Tracking the Martian CO2 Polar Ice Caps in Infrared Images; Processing TES Level-2 Data; SmaggIce Version 1.8; Solving the Swath Segment Selection Problem; The Spatial Standard Observer; Less-Complex Method of Classifying MPSK; Improvement in Recursive Hierarchical Segmentation of Data; Using Heaps in Recursive Hierarchical Segmentation of Data; Tool for Statistical Analysis and Display of Landing Sites; Automated Assignment of Proposals to Reviewers; Array-Pattern-Match Compiler for Opportunistic Data Analysis; Pre-Processor for Compression of Multispectral Image Data; Compressing Image Data While Limiting the Effects of Data Losses; Flight Operations Analysis Tool; Improvement in Visual Target Tracking for a Mobile Robot; Software for Simulating Air Traffic; Automated Vectorization of Decision-Based Algorithms; Grayscale Optical Correlator Workbench; "One-Stop Shopping" for Ocean Remote-Sensing and Model Data; State Analysis Database Tool; Generating CAHV and CAHVOR Images with Shadows in ROAMS; Improving UDP/IP Transmission Without Increasing Congestion; FORTRAN Versions of Reformulated HFGMC Codes; Program for Editing Spacecraft Command Sequences; Flight-Tested Prototype of BEAM Software; Mission Scenario Development Workbench; Marsviewer; Tool for Analysis and Reduction of Scientific Data; ASPEN Version 3.0; Secure Display of Space-Exploration Images; Digital Front End for Wide-Band VLBI Science Receiver; Multifunctional Tanks for Spacecraft; Lightweight, Segmented, Mostly Silicon Telescope Mirror; Assistant for Analyzing Tropical-Rain-Mapping Radar Data; and Anion-Intercalating Cathodes for High-Energy-Density Cells

    Benchmarking environmental machine-learning models: methodological progress and an application to forest health

    Geospatial machine learning is a versatile approach to analyzing environmental data and can help us better understand the interactions and current state of our environment. Because these algorithms learn from data, they can discover complex relationships that might be missed by other analysis methods. Modeling the interaction of creatures with their environment is referred to as ecological modeling, a subcategory of environmental modeling. A subfield of ecological modeling is species distribution modeling (SDM), which aims to understand the relation between the presence or absence of certain species and their environments. SDM differs from classical mapping/detection analysis: while the latter primarily aims for a visual representation of a species' spatial distribution, the former focuses on using the available data to build models and on interpreting them. Because no single best option exists for building such models, different settings need to be evaluated and compared against each other. When conducting such modeling comparisons, commonly referred to as benchmarking, care needs to be taken throughout the analysis steps to achieve meaningful and unbiased results. These steps comprise data preprocessing, model optimization, and performance assessment. While these general principles apply to any modeling analysis, their application in an environmental context often requires additional care with respect to data handling, possibly hidden underlying data effects, and model selection. To conduct all of this programmatically (and efficiently), toolboxes in the form of programming modules or packages are needed. This work makes methodological contributions focused on efficient, machine-learning-based analysis of environmental data. In addition, research software that generalizes and simplifies the described process has been created throughout this work.
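    The three benchmarking steps named above (data preprocessing, model optimization, performance assessment) can be illustrated with a small nested-resampling sketch. This is not the thesis's own toolbox or data: the synthetic dataset, the random-forest learner, the tuning grid, and the use of scikit-learn are assumptions made purely for illustration, and the group-wise outer folds merely stand in for the spatially aware resampling that environmental data typically requires.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV, GroupKFold, cross_val_score
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for plot-level environmental data (assumption):
# 500 observations, 10 predictors, grouped into 20 spatial blocks.
X, y = make_classification(n_samples=500, n_features=10, random_state=0)
groups = np.repeat(np.arange(20), 25)

# Data preprocessing + learner in one pipeline, so scaling is refit
# inside every resampling iteration and cannot leak information.
pipe = Pipeline([
    ("scale", StandardScaler()),
    ("rf", RandomForestClassifier(random_state=0)),
])

# Model optimization: the inner loop tunes a hyperparameter
# (plain k-fold is used here for simplicity).
inner = GridSearchCV(pipe, param_grid={"rf__max_depth": [3, 6, None]}, cv=4)

# Performance assessment: the outer loop holds out whole spatial blocks.
scores = cross_val_score(inner, X, y, groups=groups, cv=GroupKFold(n_splits=5))
print(f"mean accuracy: {scores.mean():.3f} +/- {scores.std():.3f}")
```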

    The Optimization of Geotechnical Site Investigations for Pile Design in Multiple Layer Soil Profiles Using a Risk-Based Approach

    The testing of subsurface material properties, i.e. a geotechnical site investigation, is a crucial part of projects located on or within the ground. The process consists of testing samples at a variety of locations in order to model the performance of an engineering system for design purposes. Should these models be inaccurate or unconservative due to an improper investigation, there is considerable risk of consequences such as structural collapse, construction delays, litigation, and over-design. Despite these risks, there are relatively few quantitative guidelines or research results informing an explicit, optimal investigation for a given foundation and soil profile. This is detrimental, as testing scope is often minimised in an attempt to reduce expenditure, thereby increasing the aforementioned risks. This research recommends optimal site investigations for multi-storey buildings supported by pile foundations, for a variety of structural configurations and soil profiles. The recommendations cover the optimal test type, number of tests, testing locations, and interpretation of test data. The framework is a risk-based approach, where an investigation is considered optimal if it results in the lowest total project cost, incorporating both the cost of testing and the cost of any expected negative consequences. The analysis is statistical in nature, employing Monte Carlo simulation, randomly generated virtual soils produced through random field theory, and finite element analysis for pile assessment. A number of innovations have been developed to support the novel nature of the work. For example, a new method of producing randomly generated multiple-layer soils has been devised; this work is the first instance of site investigations being optimised in multiple-layer soils, which are considerably more complex than the single-layer soils examined previously. Furthermore, both the framework and the numerical tools have themselves been extensively optimised for speed. Efficiency innovations include modifying the analysis to produce re-usable pile settlement curves, as opposed to designing and assessing the piles directly, which both reduces the amount of analysis required and allows flexible post-processing for different conditions. Other optimisations include the elimination of computationally expensive finite element analysis from within the Monte Carlo simulations, along with additional minor improvements. Practicing engineers can optimise their site investigations through three outcomes of this research. Firstly, optimal site investigation scopes are known for the numerous specific cases examined throughout this document and the resulting inferred recommendations. Secondly, a rule-of-thumb guideline has been produced, suggesting the optimal number of tests for buildings of all sizes in a single soil case of intermediate variability. Thirdly, a highly efficient and versatile software tool, SIOPS, has been produced, allowing engineers to run a simplified version of the analysis for custom soils and buildings. The tool can do almost all of the analysis shown throughout the thesis, including the use of a genetic algorithm to optimise testing locations, yet it is approximately 10 million times faster than analysis using the original framework, running on a single-core computer within minutes. Thesis (Ph.D.) -- University of Adelaide, School of Civil, Environmental and Mining Engineering, 202
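    The risk-based idea described above, choosing the investigation scope that minimises testing cost plus expected consequence cost over many virtual soils, can be sketched in a toy form. The sketch below is not SIOPS and does not reproduce the thesis's framework: the one-dimensional lognormal stiffness field, the correlation length, the cost figures, and the pass/fail design criterion are invented placeholders, and real pile assessment would use finite element analysis or settlement curves as described in the abstract.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy 1-D random field of soil stiffness: lognormal marginal with an
# exponential correlation structure (all parameters assumed).
n_cells, cell = 100, 0.5                                      # 50 m profile, 0.5 m cells
z = np.arange(n_cells) * cell
corr = np.exp(-2 * np.abs(z[:, None] - z[None, :]) / 4.0)     # scale of fluctuation ~4 m
L = np.linalg.cholesky(corr + 1e-10 * np.eye(n_cells))

def sample_field(mean=30.0, cov=0.5):
    """One virtual soil: correlated lognormal stiffness profile (MPa)."""
    sigma_ln = np.sqrt(np.log(1 + cov ** 2))
    mu_ln = np.log(mean) - 0.5 * sigma_ln ** 2
    return np.exp(mu_ln + sigma_ln * (L @ rng.standard_normal(n_cells)))

TEST_COST = 5_000          # cost per test location (assumed)
FAILURE_COST = 2_000_000   # consequence cost of an inadequate design (assumed)

def expected_total_cost(n_tests, n_mc=2000):
    """Monte Carlo estimate of testing cost + expected consequence cost."""
    locs = np.linspace(0, n_cells - 1, n_tests).astype(int)   # test depths
    failures = 0
    for _ in range(n_mc):
        field = sample_field()
        estimate = field[locs].mean()        # design value inferred from the investigation
        true_value = field.mean()            # crude proxy for the true pile response
        if 0.8 * estimate > true_value:      # toy criterion for an under-conservative design
            failures += 1
    return n_tests * TEST_COST + FAILURE_COST * failures / n_mc

# Compare investigation scopes: the optimum balances testing cost against risk.
for k in (1, 2, 4, 8, 16):
    print(k, round(expected_total_cost(k)))
```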

    Uses and applications of artificial intelligence in manufacturing

    The purpose of the thesis is to provide engineers and personnel with an overview of the concepts that underlie Artificial Intelligence and Expert Systems. Artificial Intelligence is concerned with the development of theories and techniques required to provide a computational engine with the ability to perceive, think, and act in an intelligent manner in a complex environment. An expert system is a branch of Artificial Intelligence in which the methods of reasoning emulate those of human experts. Artificial Intelligence derives its power from its ability to represent complex forms of knowledge, some of it common-sense, heuristic, and symbolic, and from its ability to apply that knowledge in searching for solutions. The thesis reviews: the components of an intelligent system; the basics of knowledge representation; search-based problem-solving methods; expert system technologies; and the uses and applications of AI in manufacturing areas such as design, process planning, production management, energy management, quality assurance, manufacturing simulation, robotics, and machine vision. The prime objectives of the thesis are to convey an understanding of the basic concepts underlying Artificial Intelligence and to identify where the technology may be applied in the field of Manufacturing Engineering.
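    As a concrete illustration of the rule-based reasoning mentioned above, the short sketch below implements forward chaining over a working memory of facts. The rules and facts are invented process-planning examples, not material from the thesis.

```python
# Minimal forward-chaining sketch: each rule is a (premises, conclusion) pair,
# and reasoning fires any rule whose premises are all in the working memory.
rules = [
    ({"material_is_aluminium", "tolerance_is_tight"}, "use_cnc_milling"),
    ({"use_cnc_milling", "batch_is_large"}, "write_parametric_nc_program"),
    ({"surface_finish_is_critical"}, "add_finishing_pass"),
]

def forward_chain(facts, rules):
    """Repeatedly fire applicable rules until no new fact can be derived."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if premises <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

print(forward_chain({"material_is_aluminium", "tolerance_is_tight",
                     "batch_is_large"}, rules))
```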

    Research and Technology Report. Goddard Space Flight Center

    This issue of Goddard Space Flight Center's annual report highlights the importance of mission operations and data systems, covering mission planning and operations; TDRSS, positioning systems, and orbit determination; ground systems and networks, hardware and software; data processing and analysis; and World Wide Web use. The report also includes flight projects, space sciences, Earth system science, and engineering and materials.