
    Design, processing and testing of LSI arrays hybrid microelectronics task

    The factors affecting the cost of electronic subsystems utilizing LSI microcircuits were determined, and the most efficient methods for low-cost packaging of LSI devices as a function of density and reliability were developed.

    Digital computer processing of LANDSAT data for North Alabama

    Computer processing procedures and programs applied to Multispectral Scanner data from LANDSAT are described. The output product is a Level 1 land use map conforming to the Universal Transverse Mercator projection. The region studied was a five-county area in north Alabama.

    Integrating Pavement Crack Detection and Analysis Using Autonomous Unmanned Aerial Vehicle Imagery

    Efficient, reliable data are necessary to make informed decisions on how best to manage aging road assets. This research explores a new method to automate the collection, processing, and analysis of transportation-network data using Unmanned Aerial Vehicles (UAVs) and computer vision technology. While current methodologies accomplish road assessment manually or semi-autonomously, this research is a proof of concept for obtaining road assessments faster and more cheaply, with a vision of little to no human interaction required. The research evaluates the strengths of applying UAV technology to pavement assessments and identifies where further work is needed. Furthermore, it validates UAVs as a viable way forward for collecting pavement information to aid asset managers in sustaining aging road assets. The system was able to capture road photos suitable for semi-automated Pavement Condition Index (PCI) processing; however, the detection algorithm achieved a maximum F-measure of only 40%. This result is low and indicates the algorithm is not sufficient for fully automated PCI classification. Accurately detecting road defects using computer vision remains a challenging problem for future research. However, using autonomous UAVs to collect the data is a viable avenue for data collection, theoretically faster than current methods at freeway speeds.
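    For reference, the F-measure cited above is the harmonic mean of precision and recall, computed from a detector's true-positive, false-positive, and false-negative counts. A minimal sketch follows; the counts are hypothetical, not taken from the study:

        def f_measure(tp: int, fp: int, fn: int) -> float:
            """Harmonic mean of precision and recall (the F1 score)."""
            precision = tp / (tp + fp)
            recall = tp / (tp + fn)
            return 2 * precision * recall / (precision + recall)

        # Hypothetical counts for a crack detector; these give exactly the
        # 40% level reported in the abstract.
        print(f_measure(tp=40, fp=60, fn=60))  # 0.4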

    Optimum Allocation of Inspection Stations in Multistage Manufacturing Processes by Using Max-Min Ant System

    In multistage manufacturing processes it is common to locate inspection stations after some or all of the processing workstations. The purpose of inspection is to reduce the total manufacturing cost resulting from unidentified defective items being processed unnecessarily through subsequent manufacturing operations. This total cost is the sum of the costs of production, inspection and failures (during production and after shipment). Introducing inspection stations into a serial multistage manufacturing process, although constituting an additional cost, is expected to be a profitable course of action. Specifically, at some positions the associated inspection costs will be recovered through the detection of defective items, before additional cost is wasted by continuing to process them.

    In this research, a novel general cost model for allocating a limited number of inspection stations in serial multistage manufacturing processes is formulated. In the allocation of inspection stations (AOIS) problem, as the number of workstations increases, the number of possible inspection station allocations increases exponentially. To identify an appropriate approach to the AOIS problem, different optimisation methods are investigated. The MAX-MIN Ant System (MMAS) algorithm is proposed as a novel approach for exploring the AOIS problem in serial multistage manufacturing processes. MMAS is an ant colony optimisation algorithm originally designed to begin with an explorative search phase and, subsequently, to make a slow transition to intensive exploitation of the best solutions found during the search, by allowing only one ant to update the pheromone trails. Two novel forms of heuristic information for the MMAS algorithm are created and exploited as a novel means of guiding ants to build reasonably good solutions from the very beginning of the search. To improve the performance of the MMAS algorithm, six well-known local search methods suitable for the AOIS problem are used. Because selecting relevant parameter values can have a great impact on the algorithm's performance, a method for tuning the most influential parameter values of the MMAS algorithm is developed.

    The contribution of this research is that, for the first time, a methodology using MMAS to solve the AOIS problem in serial multistage manufacturing processes has been developed. The methodology takes into account the constraints on inspection resources, in terms of a limited number of inspection stations. As a result, the total manufacturing cost of a product can be reduced while maintaining the quality of the product. Four numerical experiments are conducted to assess the MMAS algorithm for the AOIS problem. The performance of the MMAS algorithm is compared with a number of other methods, including the complete enumeration method (CEM), a rule of thumb, a pure random search algorithm, particle swarm optimisation, simulated annealing and a genetic algorithm. The experimental results show that the effectiveness of the MMAS algorithm lies in its considerably shorter execution time and its robustness, and that in certain conditions the results obtained by the MMAS algorithm are identical to those of the CEM. In addition, the results show that applying local search to the MMAS algorithm significantly improves its performance. The results also demonstrate that it is essential to use heuristic information with the MMAS algorithm for the AOIS problem in order to obtain a high-quality solution. It was found that the main parameters of MMAS, namely the pheromone trail intensity, the heuristic information and the pheromone evaporation rate, become less sensitive within the specified range as the number of workstations increases significantly.
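    To make the MMAS mechanics concrete, the sketch below shows the two steps that distinguish MAX-MIN Ant System from basic ant colony optimisation: solutions are built using pheromone weighted by heuristic information, only the best ant deposits pheromone, and all trails are clamped to [tau_min, tau_max]. This is a generic illustration of the algorithm, not the thesis's implementation; the encoding (one inspect/skip decision per workstation), the variable names and the parameter values are assumptions:

        import random

        def construct(pheromone, heuristic, alpha=1.0, beta=2.0):
            """Build one candidate allocation: at each workstation, choose
            'no inspection' (0) or 'inspect' (1) with probability proportional
            to pheromone**alpha * heuristic**beta."""
            solution = []
            for station in range(len(pheromone)):
                w = [pheromone[station][o] ** alpha * heuristic[station][o] ** beta
                     for o in (0, 1)]
                solution.append(0 if random.random() < w[0] / (w[0] + w[1]) else 1)
            return solution

        def mmas_update(pheromone, best_solution, best_cost,
                        rho=0.02, tau_min=0.01, tau_max=1.0):
            """MMAS pheromone update: evaporate every trail, let only the
            best ant deposit, then clamp trails to [tau_min, tau_max]."""
            for station in range(len(pheromone)):
                for option in (0, 1):
                    pheromone[station][option] *= 1.0 - rho       # evaporation
            for station, chosen in enumerate(best_solution):
                pheromone[station][chosen] += 1.0 / best_cost     # best ant only
            for station in range(len(pheromone)):
                for option in (0, 1):
                    pheromone[station][option] = min(tau_max,
                        max(tau_min, pheromone[station][option])) # MAX-MIN bounds
            return pheromone

    The clamping step is what keeps the search explorative: no trail can grow so dominant, or decay so completely, that an allocation choice is effectively frozen.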

    A Machine Learning Approach to Characterizing Particle Morphology in Nuclear Forensics

    A machine learning approach is taken to characterizing a group of synthetic uranium-bearing particles. SEM images of these lab-created particles were converted into a binary representation that captured morphological features in accordance with a guide established by Los Alamos National Laboratory. Each particle in the dataset is associated with its chemical creation conditions; the processing method, precipitation temperature and pH, and calcination temperature are most closely tied to particle morphology. Additionally, trained classifiers are able to relate final products between particles, implying that morphological features are shared between particles with similar composition.
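    A minimal sketch of the kind of pipeline described: binarize an SEM image, extract simple per-particle morphological descriptors, and fit a classifier against the known creation conditions. The library choices (scikit-image, scikit-learn), the Otsu thresholding, and the specific features are assumptions, not details from the paper:

        import numpy as np
        from skimage.filters import threshold_otsu
        from skimage.measure import label, regionprops
        from sklearn.ensemble import RandomForestClassifier

        def particle_features(image: np.ndarray) -> np.ndarray:
            """Binarize a grayscale SEM image and return one row of simple
            morphology descriptors per connected particle."""
            binary = image > threshold_otsu(image)
            regions = regionprops(label(binary))
            return np.array([[r.area, r.eccentricity, r.solidity,
                              r.perimeter / np.sqrt(r.area)] for r in regions])

        # X: stacked feature rows; y: one creation-condition label per particle
        # (e.g. processing method). Both are assumed to come from labelled data.
        # clf = RandomForestClassifier(n_estimators=200).fit(X, y)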

    The 1980 land cover for the Puget Sound region

    Both LANDSAT imagery and the video information communications and retrieval software were used to develop a land cover classification of the Puget Sound region of Washington. Planning agencies within the region were provided with a highly accurate land cover map registered to the 1980 census tracts, which could subsequently be incorporated as one data layer in a multi-layer data base. Many historical activities related to previous land cover mapping studies conducted in the Puget Sound region are summarized. Valuable insight is provided into conducting a project with a large community of users and into establishing user confidence in a multi-purpose land cover map derived from LANDSAT.

    Edge cross-section profile for colonoscopic object detection

    Colorectal cancer is the second leading cause of cancer-related deaths, claiming close to 50,000 lives annually in the United States alone. Colonoscopy is an important screening tool that has contributed to a significant decline in colorectal cancer-related deaths. During colonoscopy, a tiny video camera at the tip of the endoscope generates a video signal of the internal mucosa of the human colon. The video data is displayed on a monitor for real-time diagnosis by the endoscopist. Despite the success of colonoscopy in lowering cancer-related deaths, the miss rate for detection of both large polyps and cancers is estimated at around 4-12%. As a result, in recent years many computer-aided object detection techniques have been developed with the ultimate goal of assisting the endoscopist in lowering the polyp miss rate. Automatic object detection in video data recorded during colonoscopy is challenging due to the noisy nature of endoscopic images, caused by camera motion, strong light reflections, the wide-angle lens that cannot be automatically focused, and the location and appearance variations of objects within the colon. The unique characteristics of colonoscopy video require new image/video analysis techniques.

    This dissertation presents our investigation of the edge cross-section profile (ECSP), a local appearance model, for colonoscopic object detection. We propose several methods to derive new ECSP features from its surrounding region pixels, its first-order derivative profile, and its second-order derivative profile. These ECSP features describe discriminative patterns for different types of objects in colonoscopy. The new algorithms and software using the ECSP features can effectively detect three representative types of objects and extract their corresponding semantic units, in terms of both accuracy and analysis time. The main contributions of the dissertation are summarized as follows. The dissertation presents 1) a new ECSP calculation method and a feature-based ECSP method that extracts features on the ECSP for object detection, 2) an edgeless ECSP method that calculates the ECSP without using edges, 3) a part-based multi-derivative ECSP algorithm that segments the ECSP and its first-order and second-order derivative functions into parts and models each part using the method suitable to that part, 4) ECSP-based algorithms for detecting three representative types of colonoscopic objects, including appendiceal orifices, endoscopes during retroflexion operations, and polyps, and for extracting videos or segmented shots containing these objects as semantic units, and 5) a software package that implements these techniques and provides meaningful visual feedback of the detected results to the endoscopist. Ideally, we would like the software to provide feedback to the endoscopist before the next video frame becomes available and to process video data at the rate at which the data are captured (typically about 30 frames per second (fps)). This real-time requirement is difficult to achieve using today's affordable off-the-shelf workstations. We aim to achieve near real-time performance, where the analysis and feedback complete at a rate of at least 1 fps.

    The dissertation has the following broad impacts. Firstly, the performance study shows that our proposed ECSP-based techniques are promising, in terms of both detection rate and execution time, for detecting the appearance of the three aforementioned types of objects in colonoscopy video. Our ECSP-based techniques can be extended both to detect other types of colonoscopic objects, such as diverticula, the lumen and vessels, and to analyze other endoscopy procedures, such as laparoscopy, upper gastrointestinal endoscopy, wireless capsule endoscopy and EGD. Secondly, to the best of our knowledge, our polyp detection system is the only computer-aided system that can warn the endoscopist of the appearance of polyps in near real time. Our retroflexion detection system is also the first computer-aided system that can detect retroflexion in near real time. Retroflexion is a maneuver used by the endoscopist to inspect areas of the colon that are hard to reach. The use of our system in future clinical trials may contribute to a decline in the polyp miss rate during live colonoscopy. Our system may also be used as a training platform for novice endoscopists. Lastly, the automatic documentation of detected semantic units of colonoscopic objects can be helpful in discovering unknown patterns of colorectal cancers or new diseases, and can be used as an educational resource for endoscopic research.
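    As a rough illustration of the ECSP idea: sample image intensities along a short line perpendicular to a detected edge point, then take first- and second-order derivatives of that 1-D profile as feature sources. The sampling strategy, profile length and function names below are assumptions; the dissertation's actual feature definitions are more elaborate:

        import numpy as np

        def ecsp(image, x, y, nx, ny, half_len=10):
            """Sample the intensity profile along the edge normal (nx, ny)
            at edge point (x, y), using nearest-neighbour pixel lookup, and
            return the profile with its two derivative profiles."""
            ts = np.arange(-half_len, half_len + 1)
            rows = np.clip(np.round(y + ts * ny).astype(int), 0, image.shape[0] - 1)
            cols = np.clip(np.round(x + ts * nx).astype(int), 0, image.shape[1] - 1)
            profile = image[rows, cols].astype(float)
            d1 = np.gradient(profile)   # first-order derivative profile
            d2 = np.gradient(d1)        # second-order derivative profile
            return profile, d1, d2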

    Skylab-EREP studies in computer mapping of terrain in the Cripple Creek-Canon City area of Colorado

    Multispectral-scanner data from satellites are used as input to computers for automatically mapping terrain classes of ground cover. Major problems faced in this remote-sensing task include: (1) the effect of mixtures of classes and, primarily because of mixtures, the problem of what constitutes accurate control data, and (2) the effects of the atmosphere on spectral responses. The fundamental principles of these problems are presented, along with the results of studies of them for a test site in Colorado using LANDSAT-1 data.
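    The mixture problem mentioned above is commonly formalised as a linear mixing model, in which each observed pixel spectrum is a proportion-weighted sum of pure class spectra ("endmembers"). A hedged sketch of solving for those proportions by least squares; the endmember matrix and band values here are hypothetical, not from the study:

        import numpy as np

        def unmix(pixel, endmembers):
            """Least-squares estimate of class fractions f in pixel ~ E @ f,
            clipped to be nonnegative and renormalised to sum to 1."""
            f, *_ = np.linalg.lstsq(endmembers, pixel, rcond=None)
            f = np.clip(f, 0.0, None)
            return f / f.sum()

        # Hypothetical 4-band mean spectra for three cover classes (columns).
        E = np.array([[60, 30, 90], [55, 35, 85], [40, 80, 70], [35, 90, 60.0]])
        print(unmix(pixel=E @ np.array([0.5, 0.3, 0.2]), endmembers=E))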

    Monitoring Animal Well-being


    Assessing Binary Measurement Systems

    Binary measurement systems (BMS) are widely used in both the manufacturing industry and medicine. In industry, a BMS is often used to measure various characteristics of parts and then classify them as pass or fail according to some quality standard. Good measurement systems are essential both for problem solving (i.e., reducing the rate of defectives) and for protecting customers from receiving defective products. As a result, it is desirable to assess the performance of the BMS as well as to separate the effects of the measurement system and the production process on the observed classifications. In medicine, BMSs are known as diagnostic or screening tests, and are used to detect a target condition in subjects, thus classifying them as positive or negative. Assessing the performance of a medical test is essential in quantifying the costs due to misclassification of patients, and in the future prevention of these errors. In both industry and medicine, the most commonly used characteristics to quantify the performance of a BMS are the two misclassification rates, defined as the chance of passing a nonconforming (non-diseased) unit, called the consumer's risk (false positive), and the chance of failing a conforming (diseased) unit, called the producer's risk (false negative). In most assessment studies, it is also of interest to estimate the conforming (prevalence) rate, i.e., the probability that a randomly selected unit is conforming (diseased).

    There are two main approaches to assessing the performance of a BMS. Both involve measuring a number of units one or more times with the BMS. The first, called the "gold standard" approach, requires the use of a gold-standard measurement system that can determine the state of units with no classification errors. When a gold standard does not exist, or is too expensive or time-consuming, another option is to repeatedly measure units with the BMS and then use a latent class approach to estimate the parameters of interest. In industry, for both approaches, the standard sampling plan involves randomly selecting parts from the population of manufactured parts. In this thesis, we focus on a specific context commonly found in the manufacturing industry. First, the BMS under study is nondestructive. Second, the BMS is used for 100% inspection or some other kind of systematic inspection of the production yield. In this context, we are likely to have available a large number of previously passed and failed parts. Furthermore, the inspection system typically tracks the number of parts passed and failed; that is, we often have baseline data about the current pass rate, separate from the assessment study. Finally, we assume that during the time of the evaluation the process is under statistical control and the BMS is stable.

    Our main goal is to investigate the effect of using sampling plans that involve random selection of parts from the available populations of previously passed and failed parts, i.e. conditional selection, on the estimation procedure and the main characteristics of the estimators. We also demonstrate the value of combining the additional information provided by the baseline data with the data collected in the assessment study in improving the overall estimation procedure, and we examine how the availability of baseline data and the use of a conditional selection sampling plan affect recommendations on the design of the assessment study.

    In Chapter 2, we give a summary of the existing estimation methods and sampling plans for a BMS assessment study, in both industrial and medical settings, that are relevant to our context. In Chapters 3 and 4, we investigate the assessment of a BMS in the case where we assume that the misclassification rates are common to all conforming/nonconforming parts and that repeated measurements on the same part are independent, conditional on the true state of the part, i.e. conditional independence. We call models using these assumptions fixed-effects models. In Chapter 3, we look at the case where a gold standard is available, whereas in Chapter 4 we investigate the "no gold standard" case. In both cases, we show that using a conditional selection plan, along with the baseline information, substantially improves the accuracy and precision of the estimators compared to the standard sampling plan. In Chapters 5 and 6, we investigate the case where we allow for possible variation in the misclassification rates within conforming and nonconforming parts, by proposing new random-effects models. These models relax the fixed-effects assumptions of constant misclassification rates and conditional independence. As in the previous chapters, we focus on investigating the effect of using conditional selection and baseline information on the properties of the estimators, and we give study design recommendations based on our findings. In Chapter 7, we discuss other potential applications of the conditional selection plan, where the study data are augmented with the baseline information on the pass rate, especially in the context where there are multiple BMSs under investigation.
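    A worked sketch of the two misclassification rates in the gold-standard case, using the abstract's terminology: each risk is estimated as a simple sample proportion over the units whose true state the gold standard has established. The counts below are hypothetical:

        def bms_estimates(passed_nonconforming: int, n_nonconforming: int,
                          failed_conforming: int, n_conforming: int):
            """Sample-proportion estimates of the consumer's risk (passing a
            nonconforming unit) and the producer's risk (failing a conforming
            unit) from a gold-standard assessment study."""
            consumers_risk = passed_nonconforming / n_nonconforming
            producers_risk = failed_conforming / n_conforming
            return consumers_risk, producers_risk

        # Hypothetical study: of 200 nonconforming parts, 12 were wrongly
        # passed; of 800 conforming parts, 24 were wrongly failed.
        print(bms_estimates(12, 200, 24, 800))   # (0.06, 0.03)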