
    Implementing Flow Processing with Product End of Life Remanufacturing

    This research focuses on improving the efficiency of the remanufacturing process by estimating workstation utilization through the percentages of %Blocking and %Waiting at individual workstations within a remanufacturing flow line. The aim is to enable improved methods for overcoming the effect of variability. An extensive literature review revealed the need for strategies to recover End of Life products, driven by the introduction and implementation of legislative directives requiring manufacturers to recover End of Life resources. Among the range of product recovery strategies analyzed, End of Life product remanufacturing emerged as a suitable strategy because it extends the operational life of existing products without consuming the new resources required to make new products. Remanufacturing is a process in which a product is disassembled to component level; each component is thoroughly examined for defects and, where defects are found, either repaired or replaced, thereby extending the product's life span. However, remanufacturing is not widely applied across industry sectors because it is a labour-intensive and expensive process compared with making new products. The process is still in its infancy, with only a small number of industries, such as automotive and aerospace, deriving benefit from its effective use. Ideally, a suitable manufacturing method, i.e. a flow processing system, should be used to remanufacture products. However, when flow processing is deployed, a number of factors affect the process that, if not tackled, result in poor performance and poor efficiency of the overall remanufacturing system.
This inefficiency is primarily due to several sources of variation: supply, product design, parts specification, operation, and demand variability. Further investigation characterized remanufacturing variability and identified ways its effect can be removed or reduced using Lean principles, e.g. Single Minute Exchange of Dies, and an appropriate manufacturing system. Based on the literature reviewed and experimental design, novel equations were developed, along with a set of rules, that accurately measure workstation utilization in terms of %Blocking and %Waiting at individual workstations
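The abstract does not reproduce the thesis's novel equations, so the sketch below only illustrates the underlying idea: a workstation's effective utilization is what remains once its blocked and waiting fractions are subtracted. The station names and percentages are hypothetical.

```python
# Minimal illustrative sketch (not the thesis's equations): effective
# utilization of a workstation given the observed fraction of time it is
# blocked (downstream full) or waiting (starved by upstream).

def effective_utilization(pct_blocking: float, pct_waiting: float) -> float:
    """Fraction of time a workstation is actually processing."""
    if not 0 <= pct_blocking + pct_waiting <= 100:
        raise ValueError("blocking + waiting must lie in [0, 100]")
    return (100.0 - pct_blocking - pct_waiting) / 100.0

# Hypothetical remanufacturing flow line measurements:
line = [
    ("disassembly", 12.0, 5.0),
    ("inspection",   3.0, 20.0),
    ("repair",       0.0, 35.0),
    ("reassembly",   8.0, 10.0),
]
for name, blk, wait in line:
    print(f"{name:12s} utilization = {effective_utilization(blk, wait):.2f}")
```

In this sketch the repair station, starved 35% of the time, is the obvious target for variability reduction, which is the kind of diagnosis the %Blocking/%Waiting measures support.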

    Autonomous Finite Capacity Scheduling using Biological Control Principles

    The vast majority of research efforts in finite capacity scheduling over the past several years have focused on generating precise and almost exact working schedules, presupposing complete information and a deterministic environment. During execution, however, production may be subject to considerable variability, which may lead to frequent schedule interruptions. Production scheduling mechanisms are typically developed on a centralised control architecture, in which all knowledge bases and databases are modelled at the same location. Such an architecture has difficulty handling complex manufacturing systems that require knowledge and data at different locations. Adopting biological control principles refers to a process in which a schedule is developed prior to the start of processing, after considering all the parameters involved at a resource, and is updated as the process executes. This research reviews best practice in gene transcription and translation control methods and adopts these principles in the development of an autonomous finite capacity scheduling control logic aimed at reducing excessive manual input in planning tasks. With autonomous decision-making functionality, finite capacity scheduling will, as far as practicable, be able to respond autonomously to schedule disruptions by deploying proactive scheduling procedures that revise or re-optimize the schedule when unexpected events occur. The novelty of this work is the ability of production resources to take decisions autonomously, in the same way that decisions are taken by autonomous entities in the process of gene transcription and translation.
The idea has been implemented by integrating simulation and modelling techniques with Taguchi analysis to investigate the contributions of finite capacity scheduling factors and to determine the ‘what if’ scenarios encountered due to variability in production processes. The control logic adopts the induction rules used in gene expression control mechanisms, as studied in biological systems. Scheduling factors are identified and investigated to find their effects on selected performance measures for each resource in use. How these factors are used to deal with variability in the process is a major objective of this research, since it is this variability that makes autonomous decision making of interest. Although different scheduling techniques have been applied successfully in production planning and control, the results obtained from including the autonomous finite capacity scheduling control logic prove that significant improvement can still be achieved
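As a loose illustration of the trigger-and-respond behaviour described above (not the thesis's control logic: the job data and the earliest-due-date rule are hypothetical), a resource can hold its own schedule and revise it locally when a disruption event arrives, rather than waiting for a central planner:

```python
# Each resource keeps its own schedule and revises it autonomously.
# The trigger -> response structure loosely mirrors an induction rule
# in gene expression control.

def build_schedule(jobs):
    """Sequence (due_date, name) jobs by earliest due date (EDD)."""
    return sorted(jobs, key=lambda j: j[0])

def on_disruption(schedule, delayed_job, extra_time):
    """Induction-style rule: when a job is disrupted, push back its
    due date and re-sequence locally."""
    revised = [(due + extra_time if name == delayed_job else due, name)
               for due, name in schedule]
    return build_schedule(revised)

jobs = [(5, "J1"), (3, "J2"), (9, "J3")]
sched = build_schedule(jobs)            # J2 first under EDD
sched = on_disruption(sched, "J2", 7)   # disruption delays J2
print([name for _, name in sched])      # ['J1', 'J3', 'J2']
```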

    NASA SBIR abstracts of 1990 phase 1 projects

    The research objectives of the 280 projects placed under contract in the National Aeronautics and Space Administration (NASA) 1990 Small Business Innovation Research (SBIR) Phase 1 program are described. The basic document consists of edited, non-proprietary abstracts of the winning proposals submitted by small businesses in response to NASA's 1990 SBIR Phase 1 Program Solicitation. The abstracts are presented under the 15 technical topics within which Phase 1 proposals were solicited. Each project was assigned a sequential identifying number from 001 to 280, in order of its appearance in the body of the report. The document also includes appendixes that provide additional information about the SBIR program and permit cross-referencing of the 1990 Phase 1 projects by company name, location by state, principal investigator, NASA field center responsible for management of each project, and NASA contract number

    Designing a robust production system for erratic demand environments.

    Production systems must have the right type of material in the right quantities when required for production. They must minimize work in progress while ensuring no stock-out occurs. While these twin opposing goals are achievable when demand is stable, they are difficult to realize under an erratic demand pattern. This dissertation aims to develop a production system that can meet erratic demand with minimal costs or errors. After a detailed introduction to the problem considered, we review the relevant literature. We then conduct a numerical analysis of current production systems, identify their deficiencies, and present our solution to address these deficiencies via the ARK (Automated Replenishment System) technique. This technique is applied to a real-world problem at Methode Engineering. We conclude by detailing the scientific benefit of our technique and proposing ideas for future research
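The ARK technique itself is not detailed in the abstract. As a hedged sketch of what any automated replenishment system must compute under erratic demand, the fragment below sizes a reorder point with safety stock proportional to demand variability; the demand figures, lead time, and service-level factor are all hypothetical.

```python
import statistics

# Standard reorder-point logic (a sketch, not the dissertation's ARK
# technique): expected lead-time demand plus safety stock sized to
# absorb demand variability.

def reorder_point(demand_history, lead_time_periods, z=1.65):
    """z = 1.65 targets roughly a 95% service level under a normal
    approximation of per-period demand."""
    mean_d = statistics.mean(demand_history)
    sd_d = statistics.stdev(demand_history)
    safety_stock = z * sd_d * lead_time_periods ** 0.5
    return mean_d * lead_time_periods + safety_stock

weekly_demand = [40, 5, 90, 12, 70, 8, 55, 20]   # erratic pattern
print(round(reorder_point(weekly_demand, lead_time_periods=2), 1))  # 148.8
```

Note how the erratic pattern dominates the result: safety stock (about 74 units) is nearly as large as expected lead-time demand (75 units), which is exactly why stable-demand systems fail in this environment.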

    Constraint-Based Supply Chain Inventory Deployment Strategies

    The development of Supply Chain Management has occurred gradually over the latter half of the last century, and it will continue to evolve in this century in response to continual changes in the business environment. As organizations exhaust opportunities for internal breakthrough improvements, they will increasingly turn toward the supply chain as an additional source of untapped improvement. Manufacturers in particular can benefit from this increased focus on the chain, but the gains realized will vary by the type of supply chain. By applying basic production control principles to the chain, and effectively using tools already common at the production line level, organizations can address important supply chain considerations. Both the Theory of Constraints and the factory physics principles behind the Constant WIP (CONWIP) concept focus on the system constraint with the aim of controlling inventory. Each can be extrapolated to a system whose boundaries span the entire supply chain
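The Constant WIP idea mentioned above caps the total number of jobs in the system: a fixed pool of cards limits inventory, and a new job is released only when a finished job frees a card. A minimal sketch with hypothetical numbers:

```python
from collections import deque

# CONWIP release rule: never let (released - completed) exceed the
# WIP cap. Job counts and cap are illustrative.

def conwip_release(wip_cap: int, completions, arrivals: int) -> int:
    """Number of new jobs that may be released right now."""
    in_system = arrivals - len(completions)
    return max(0, wip_cap - in_system)

released = 0
done = []
backlog = deque(range(10))    # 10 jobs waiting to enter the line
WIP_CAP = 4

# release up to the cap, then "complete" one job, freeing a card
for _ in range(min(conwip_release(WIP_CAP, done, released), len(backlog))):
    backlog.popleft()
    released += 1
done.append(0)                # one job finishes
allowed = conwip_release(WIP_CAP, done, released)
print(released, allowed)      # prints "4 1": 4 released, 1 card now free
```

The same rule extrapolates to the supply-chain scale discussed above: the "line" becomes the whole chain, and the card pool caps total chain inventory.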

    Research and Technology

    Langley Research Center is engaged in the basic and applied research necessary for the advancement of aeronautics and space flight, generating advanced concepts for the accomplishment of related national goals, and providing research advice, technological support, and assistance to other NASA installations, other government agencies, and industry. Highlights of major accomplishments and applications are presented

    Engineering Education and Research Using MATLAB

    MATLAB is a software package used primarily in the field of engineering for signal processing, numerical data analysis, modeling, programming, simulation, and computer graphic visualization. In the last few years, it has become widely accepted as an efficient tool, and, therefore, its use has significantly increased in scientific communities and academic institutions. This book consists of 20 chapters presenting research works using MATLAB tools. Chapters include techniques for programming and developing Graphical User Interfaces (GUIs), dynamic systems, electric machines, signal and image processing, power electronics, mixed signal circuits, genetic programming, digital watermarking, control systems, time-series regression modeling, and artificial neural networks

    A DETECTION AND DATA ACQUISITION SYSTEM FOR PRECISION BETA DECAY SPECTROSCOPY

    Free neutron and nuclear beta decay spectroscopy serves as a robust laboratory for investigating the Standard Model of Particle Physics. Observables such as decay product angular correlations and energy spectra overconstrain the Standard Model and serve as a sensitive probe for Beyond the Standard Model physics. Improved measurement of these quantities is necessary to complement the TeV-scale physics being conducted at the Large Hadron Collider. The UCNB, 45Ca, and Nab experiments aim to improve upon existing measurements of free neutron decay angular correlations and to set new limits in the search for exotic couplings in beta decay. To achieve these experimental goals, a highly pixelated, thick silicon detector with a 100 nm entrance window has been developed for precision beta spectroscopy and the direct detection of 30 keV beta decay protons. The detector has been characterized for its performance in energy reconstruction and particle arrival time determination. A Monte Carlo simulation of signal formation in the silicon detector and propagation through the electronics chain has been written to develop optimal signal analysis algorithms for minimally biased energy and timing extraction. A tagged-electron timing test has been proposed and investigated as a means to assess the validity of these Monte Carlo efforts. A universal platform for data acquisition (DAQ) has been designed and implemented in National Instruments' PXIe-5171R digitizer/FPGA hardware. The DAQ retains a ring buffer of the most recent 400 ms of data in all 256 channels, so that a waveform trace can be returned from any combination of pixels at any resolution for complete energy reconstruction. Low-threshold triggers on individual channels were implemented in the FPGA as a generic piecewise-polynomial filter for universal, real-time digital signal processing, allowing arbitrary filter implementation on a pixel-by-pixel basis.
This system is universal in the sense that it provides completely flexible, complex, and debuggable triggering at both the pixel and global level without recompiling the firmware. The culmination of this work is a system capable of a 10 keV trigger threshold, 3 keV resolution, and a maximum arrival-time systematic of 300 ps, even in the presence of large-amplitude noise components
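The ring buffer and per-channel trigger described above can be illustrated in software. This is a hedged sketch, not the experiment's FPGA implementation: the moving-average step filter is a crude stand-in for the piecewise-polynomial filter, and the buffer size, filter length, and threshold are invented for illustration.

```python
from collections import deque

class RingBuffer:
    """Always retain only the most recent `size` samples of a channel."""
    def __init__(self, size: int):
        self.buf = deque(maxlen=size)   # old samples fall off automatically

    def push(self, samples):
        self.buf.extend(samples)

    def trace(self):
        return list(self.buf)           # chronological order

def triggered(trace, threshold: float, length: int = 4) -> bool:
    """Fire when the average of the newest `length` samples exceeds the
    average of the `length` samples before them by more than `threshold`
    (a simple step filter standing in for the FPGA's piecewise-polynomial
    filter)."""
    for k in range(2 * length, len(trace) + 1):
        window = trace[k - 2 * length:k]
        step = (sum(window[length:]) - sum(window[:length])) / length
        if step > threshold:
            return True
    return False

rb = RingBuffer(16)
rb.push([0.0] * 13)
rb.push([5.0, 5.0, 5.0])                # a pulse edge arrives
print(triggered(rb.trace(), threshold=2.0))   # prints True
```

Because the trigger is just a parameterised filter over the buffered trace, a different filter can be swapped in per channel without touching the buffering logic, which is the flexibility the firmware design above provides without recompilation.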

    Assembly Line

    An assembly line is a manufacturing process in which parts are added to a product in a sequential manner, using optimally planned logistics to create a finished product in the fastest possible way. It is a flow-oriented production system in which the productive units performing the operations, referred to as stations, are aligned in a serial manner. The present edited book is a collection of 12 chapters written by experts and well-known professionals of the field. The volume is organized in three parts according to recent research on assembly lines. The first part of the book is devoted to the assembly line balancing problem (ALBP) and includes chapters dealing with its different variants. In the second part, optimization problems in assembly line structure are considered; in many situations there are several contradictory goals that have to be satisfied simultaneously. The third part of the book deals with testing problems in assembly lines, giving an overview of new trends, techniques, and methodologies for testing the quality of a product at the end of the assembly line
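As a concrete illustration of the assembly line balancing problem mentioned above (the task times, precedence relations, and cycle time below are hypothetical, and the largest-candidate greedy rule is only one of many ALBP heuristics):

```python
# Greedy ALBP heuristic: open stations one at a time and fill each with
# the longest available task that fits within the cycle time, respecting
# precedence. Assumes every task time is at most the cycle time.

def balance_line(task_times, precedence, cycle_time):
    assigned, stations = set(), []
    while len(assigned) < len(task_times):
        station, load = [], 0.0
        while True:
            ready = [t for t in task_times
                     if t not in assigned
                     and precedence.get(t, set()) <= assigned
                     and load + task_times[t] <= cycle_time]
            if not ready:
                break
            task = max(ready, key=task_times.get)   # largest candidate first
            station.append(task)
            assigned.add(task)
            load += task_times[task]
        if not station:
            raise ValueError("a task exceeds the cycle time")
        stations.append(station)
    return stations

times = {"a": 4, "b": 3, "c": 5, "d": 2, "e": 4}
prec = {"c": {"a"}, "d": {"b"}, "e": {"c", "d"}}
print(balance_line(times, prec, cycle_time=8))   # [['a', 'b'], ['c', 'd'], ['e']]
```

Eighteen minutes of work fit into three stations against a theoretical minimum of ceil(18/8) = 3, so this particular instance balances optimally; in general the greedy rule gives only a feasible, not optimal, assignment.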

    Beyond Lean and the Working Environment

    The Lean Production System (LPS) has become very popular among manufacturing industries, services, and large commercial areas over the years due to its ability to increase production. However, LPS practices can have both negative and positive impacts on workers' psychosocial factors, such as motivation, satisfaction, and commitment, and on physical and psychological health factors, such as musculoskeletal disorders (MSDs) and stress. Since LPS is a very broad term, there is no simple relation between LPS implementation and its consequences for the work environment and workers. It is therefore necessary to study the different factors that can affect the work environment in each case. A wide variety of LPS practices can have negative and positive impacts on workers, and the effects of lean may also depend on the sector and country in which it is implemented. No studies in the literature cover all these effects and analyse them together with the environment involved. In this study, articles from scientific publications over the last 26 years were collected and analysed. Results show that Just-in-Time (JIT) practices are strongly related to negative effects on MSDs and stress, caused by intensification of work and increased control over workers. However, JIT practices such as manufacturing cells can increase job enrichment through multi-skilling. Respect-for-people practices can act as buffers to lean practices: job rotation reduces human effort and work pace through increased recovery time, and workgroups create job support, acting as a buffer to psychosocial factors. Results show a majority of negative effects in the automotive sector and in countries such as Canada, the USA, and the UK. Scandinavian countries have implemented hybrid forms of Lean, which are related to an increase in effects such as motivation and job satisfaction.
However, the overall analysis is that the effects of lean on workers depend more on the way companies manage and implement it than on countries' cultural factors. This study can be useful for managers and leaders who seek to transform traditional enterprises into exemplars of lean success, showing the need to balance lean with good working conditions