
    Processing model for tungsten powders and extension to nanoscale size range

    Nanoscale tungsten powders promise access to very hard, strong, and wear-resistant materials via the press–sinter route. A small particle size changes the response during sintering, requiring lower temperatures and shorter times to attain dense but small grain size structures. On the other hand, oxide reduction and impurity evaporation favour high sintering temperatures and long hold times. Accordingly, press–sinter processing encounters conflicting constraints when applied to small particles. Presented here is an analysis of press–sinter tungsten particle processing to isolate conditions that balance the temperature- and size-dependent effects. The calculations are pinned by existing data. Opportunities are identified for new consolidation approaches to deliver a small grain size in a full density structure.

    Symbolic Partial-Order Execution for Testing Multi-Threaded Programs

    We describe a technique for systematic testing of multi-threaded programs. We combine Quasi-Optimal Partial-Order Reduction, a state-of-the-art technique that tackles path explosion due to interleaving non-determinism, with symbolic execution to handle data non-determinism. Our technique iteratively and exhaustively finds all executions of the program. It represents program executions using partial orders and finds the next execution using an underlying unfolding semantics. We avoid the exploration of redundant program traces using cutoff events. We implemented our technique as an extension of KLEE and evaluated it on a set of large multi-threaded C programs. Our experiments found several previously undiscovered bugs and undefined behaviors in memcached and GNU sort, showing that the new method is capable of finding bugs in industrial-size benchmarks.
    Comment: Extended version of a paper presented at CAV'2
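The core observation, that many interleavings are redundant reorderings of independent events, can be illustrated with a toy Mazurkiewicz-trace sketch (this is only an illustration of the redundancy partial-order reduction removes, not the paper's unfolding semantics or cutoff-event machinery; the event encoding and independence relation are assumptions):

```python
def interleavings(a, b):
    """Yield all interleavings of two per-thread event sequences."""
    if not a:
        yield b
        return
    if not b:
        yield a
        return
    for rest in interleavings(a[1:], b):
        yield (a[0],) + rest
    for rest in interleavings(a, b[1:]):
        yield (b[0],) + rest

def independent(e, f):
    # two events commute iff they come from different threads
    # and touch different variables
    return e[0] != f[0] and e[1] != f[1]

def canonical(trace):
    # bubble adjacent independent events into lexicographic order;
    # equivalent traces converge to a single representative
    t = list(trace)
    changed = True
    while changed:
        changed = False
        for i in range(len(t) - 1):
            if independent(t[i], t[i + 1]) and t[i] > t[i + 1]:
                t[i], t[i + 1] = t[i + 1], t[i]
                changed = True
    return tuple(t)

# thread 1 writes x then a; thread 2 writes x
t1 = (("t1", "x"), ("t1", "a"))
t2 = (("t2", "x"),)
all_traces = list(interleavings(t1, t2))
classes = {canonical(t) for t in all_traces}
# the 3 interleavings collapse to 2 equivalence classes: only the
# relative order of the two conflicting writes to x matters
```

An exhaustive tester then needs to explore one representative per class rather than every interleaving.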

    An integrated approach for lean production using simulation and data envelopment analysis

    According to the extant literature, improving the leanness of a production system boosts a company’s productivity and competitiveness. However, such an endeavor usually involves managing multiple, potentially conflicting objectives. This study proposes a framework that analyzes lean production methods using simulation and data envelopment analysis (DEA) to accommodate the underlying multi-objective decision-making problem. The proposed framework can help identify the most efficient solution alternative by (i) considering the most common lean production methods for assembly line balancing, such as single minute exchange of dies (SMED) and multi-machine set-up reduction (MMSUR), (ii) creating and simulating various alternative assembly line configuration options via discrete-event simulation modeling, and (iii) formulating and applying DEA to identify the best alternative assembly system configuration for the multi-objective decision making. In this study, we demonstrate the viability and superiority of the proposed framework with an application case on an automotive spare parts production system. The results show that the suggested framework substantially improves the existing system by increasing efficiency while concurrently decreasing work-in-process (WIP).
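In the single-input, single-output case the CCR efficiency score used by DEA reduces to a ratio comparison, which gives a minimal sketch of how DEA ranks alternatives (the multi-input/multi-output model needed for real line configurations requires solving a linear program per alternative; the data below are invented):

```python
def dea_efficiency(inputs, outputs):
    """CCR efficiency with one input and one output per decision-making
    unit: each unit's output/input ratio, normalised by the best ratio."""
    ratios = [o / i for i, o in zip(inputs, outputs)]
    best = max(ratios)
    return [r / best for r in ratios]

# hypothetical line configurations: input = WIP level, output = throughput
wip = [10.0, 20.0, 40.0]
throughput = [100.0, 150.0, 200.0]
scores = dea_efficiency(wip, throughput)
# the first configuration is efficient (score 1.0); the others are dominated
```

Each simulated assembly-line configuration would supply one row of inputs and outputs, and the configuration scoring 1.0 is the efficient frontier member.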

    17-11 Evaluation of Transit Priority Treatments in Tennessee

    Many big cities are progressively implementing transit-friendly corridors, especially in urban areas where traffic may be increasing at an alarming rate. Over the years, Transit Signal Priority (TSP) has proven to be very effective in creating transit-friendly corridors through its ability to improve transit vehicle travel time, serviceability, and reliability. TSP, as part of Transit Oriented Development (TOD), is associated with great benefits to community liveability, including lower environmental impacts, reduced traffic congestion, fewer vehicular accidents, and shorter travel times, among others. This research has therefore analysed the impact of TSP on bus travel times, late-bus recovery at the bus stop level, delay (on the mainline and side streets), and Level of Service (LOS) at the intersection level on selected corridors and intersections in Nashville, Tennessee, to address the problem of transit vehicle delay caused by high traffic congestion in the Nashville metropolitan area. This study also developed a flow-delay model to predict delay per vehicle for a lane group under interrupted-flow conditions and compared measures of effectiveness (MOEs) before and after TSP. Unconditional green-extension and red-truncation active priority strategies were developed in the Vehicle Actuated Programming (VAP) language, which was tied to the VISSIM signal controller to grant priority to transit vehicles approaching the traffic signal from 75 m before the stop line. The findings from this study indicated that TSP recovers bus lateness at bus stops by 25.21% to 43.1% on average, improves bus travel time by 5.1% to 10%, increases side-street delay by 15.9%, and favours other vehicles using the priority approach with reductions of 5.8% in travel time and 11.6% in delay, respectively. Findings also indicated that TSP may not affect LOS under low to medium traffic conditions, but LOS may increase under high traffic conditions.
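The active priority logic described above can be sketched as a simple decision rule (a hypothetical simplification of the VAP controller logic; the function name, the maximum-extension parameter, and the unconditional-priority policy are assumptions, with the 75 m detection range taken from the text):

```python
def priority_action(signal_state, time_to_green_end_s, bus_distance_m,
                    detection_range_m=75.0, max_extension_s=10.0):
    """Unconditional TSP: extend the green if a detected bus arrives near
    the end of green, truncate the red if a bus arrives on red."""
    if bus_distance_m > detection_range_m:
        return "none"          # bus has not yet reached the detector
    if signal_state == "green" and time_to_green_end_s <= max_extension_s:
        return "extend_green"  # hold the green until the bus clears
    if signal_state == "red":
        return "truncate_red"  # cut the red short for the bus approach
    return "none"
```

A microsimulation would call such a rule each controller step, which is how the VAP logic hooks into the VISSIM signal controller.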

    Environmental impact of combined ITS traffic management strategies

    Transport was responsible for 20% of total greenhouse gas emissions in Europe during 2011 (European Environmental Agency 2013), with road transport being the key contributor. To tackle this, targets have been established in Europe and worldwide to curb transport emissions. This poses a significant challenge to local government and transport operators, who need to identify a set of effective measures to reduce the environmental impact of road transport while keeping traffic flowing smoothly. Of the road transport pollutants, this paper considers NOx, CO2 and black carbon (BC). A particular focus is placed on black carbon, which is formed through incomplete combustion of carbonaceous materials, as it has a significant impact on the Earth’s climate system: it absorbs solar radiation, influences cloud processes, and alters the melting of snow and ice cover (Bond et al. 2013). BC also raises serious health concerns, being associated with asthma and other respiratory problems, heart attacks, and lung cancer (Sharma 2010; United States Environmental Protection Agency 2012). Since BC emissions are mainly produced during the decelerating and accelerating phases (Zhang et al. 2009), ITS actions that reduce stop-and-go phases have the potential to reduce BC emissions. This paper investigates the effectiveness of combined ITS actions in an urban context in reducing CO2 and BC emissions and improving traffic conditions.
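Because BC output concentrates in deceleration and acceleration phases, one crude proxy for BC-relevant driving is the share of a speed trace spent in strong speed changes (the proxy and its threshold are assumptions for illustration, not the paper's emission model):

```python
def stop_and_go_fraction(speeds_mps, accel_threshold_mps2=0.5):
    """Fraction of 1 s time steps whose speed change exceeds the
    threshold, i.e. time spent accelerating or braking hard."""
    if len(speeds_mps) < 2:
        return 0.0
    steps = list(zip(speeds_mps, speeds_mps[1:]))
    busy = sum(1 for prev, cur in steps
               if abs(cur - prev) >= accel_threshold_mps2)
    return busy / len(steps)

# a short stop-and-go profile: accelerate, cruise, brake to a stop
profile = [0.0, 1.0, 2.0, 2.0, 2.0, 1.0, 0.0]
share = stop_and_go_fraction(profile)
```

An ITS action that smooths this profile (e.g. signal coordination) lowers the proxy, which is the mechanism the paper relies on for BC reduction.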

    Model-Based Proactive Read-Validation in Transaction Processing Systems

    Concurrency control protocols based on read-validation schemes allow transactions that are doomed to abort to keep running until a subsequent validation check reveals them as invalid. These late aborts do not favor the reduction of wasted computation and can penalize performance. To counteract this problem, we present an analytical model that predicts the abort probability of transactions handled via read-validation schemes. Our goal is to determine the most suitable points, along a transaction's lifetime, at which to carry out a validation check. This can lead to aborting doomed transactions early, thus saving CPU time. We show how to exploit the abort probability predictions returned by the model, in combination with a threshold-based scheme, to trigger read-validations. We also show how this approach can markedly improve performance, leading to up to 14% better turnaround, as demonstrated by experiments carried out with a port of the TPC-C benchmark to Software Transactional Memory.
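The threshold-triggered validation can be sketched under a simple independence assumption in which each read conflicts with a concurrent write at a fixed rate (the paper's analytical model is more detailed; the numbers here are illustrative only):

```python
def should_validate(reads_done, conflict_rate, threshold=0.2):
    """Trigger an early read-validation once the predicted abort
    probability crosses the threshold. If each of n reads conflicts
    independently with rate c, then p_abort = 1 - (1 - c)^n."""
    p_abort = 1.0 - (1.0 - conflict_rate) ** reads_done
    return p_abort >= threshold

# with a 5% per-read conflict rate and a 0.2 threshold, validation
# first triggers at the fifth read
```

Validating at such points aborts doomed transactions early instead of letting them burn CPU until the commit-time check.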

    Deterministic Consistency: A Programming Model for Shared Memory Parallelism

    The difficulty of developing reliable parallel software is generating interest in deterministic environments, where a given program and input can yield only one possible result. Languages or type systems can enforce determinism in new code, and runtime systems can impose synthetic schedules on legacy parallel code. To parallelize existing serial code, however, we would like a programming model that is naturally deterministic without language restrictions or artificial scheduling. We propose "deterministic consistency" (DC), a parallel programming model as easy to understand as the "parallel assignment" construct in sequential languages such as Perl and JavaScript, where concurrent threads always read their inputs before writing shared outputs. DC supports common data- and task-parallel synchronization abstractions such as fork/join and barriers, as well as non-hierarchical structures such as producer/consumer pipelines and futures. A preliminary prototype suggests that software-only implementations of DC can run applications written for popular parallel environments such as OpenMP with low (<10%) overhead for some applications.
    Comment: 7 pages, 3 figures
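The read-your-inputs-before-writing discipline behind DC mirrors parallel assignment, and can be modelled with a snapshot-plus-fresh-copy step whose result is independent of task ordering (a toy model, not the paper's runtime; all names are invented):

```python
def deterministic_step(shared, updates):
    """Every task reads the same immutable snapshot; all writes land in
    a fresh copy, so the outcome does not depend on task ordering."""
    snapshot = dict(shared)   # inputs frozen before any task writes
    new_state = dict(shared)
    for key, fn in updates:
        new_state[key] = fn(snapshot)
    return new_state

# like the parallel assignment x, y = y, x: both tasks read old values
state = {"x": 1, "y": 2}
swap = [("x", lambda s: s["y"]), ("y", lambda s: s["x"])]
result = deterministic_step(state, swap)
# any permutation of the task list yields the same result
```

Running the tasks in any order, or concurrently, produces the same final state, which is exactly the one-result-per-input guarantee DC aims for.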