337 research outputs found

    Novel Phase Between Band and Mott Insulators in Two Dimensions

    Full text link
    We investigate the ground-state phase diagram of the half-filled repulsive Hubbard model in two dimensions in the presence of a staggered potential Δ, the so-called ionic Hubbard model, using cluster dynamical mean-field theory. We find that for large Coulomb repulsion, U ≫ Δ, the system is a Mott insulator (MI). For weak to intermediate values of Δ, on decreasing U, the Mott gap closes at a critical value U_{c1}(Δ) beyond which a correlated insulating phase with possible bond order (BO) is found. Further, this phase undergoes a first-order transition to a band insulator (BI) at U_{c2}(Δ) with a finite charge gap at the transition. For large Δ, there is a direct first-order transition from a MI to a BI with a single metallic point at the phase boundary.
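For reference, the ionic Hubbard model named in this abstract is conventionally written as follows (the hopping amplitude t and the bipartite-lattice sign factor (-1)^i are standard notation, not taken from the abstract):

```latex
\[
H = -t \sum_{\langle i,j \rangle, \sigma}
      \left( c^{\dagger}_{i\sigma} c_{j\sigma} + \mathrm{h.c.} \right)
    + U \sum_{i} n_{i\uparrow} n_{i\downarrow}
    + \Delta \sum_{i} (-1)^{i} n_{i},
\qquad n_{i} = n_{i\uparrow} + n_{i\downarrow}.
\]
```

The staggered term doubles the unit cell and opens a band gap already at U = 0, which is why the small-U limit is a band insulator, while the Hubbard term drives the large-U Mott insulator.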

    Effect of Humidity and Testing Strategy on Friction Performance of Model Brake Pads Containing Nano-Additives

    Get PDF
    The interaction of brakes with the external environment can considerably influence their performance and can be related to friction instabilities. Humidity can alter the chemistry of friction surfaces, leading to unwanted phenomena that increase product cost. In addition to the chemical phenomena leading to unwanted reactions, there are physical effects related to the adsorption of humidity and the modification of adhesion, accompanied by changes in contact surfaces and contact mechanics. The goal of this paper is to address these chemical and physical phenomena occurring at the friction interfaces of model friction materials and to relate them to friction performance. Friction tests were performed using a bench-top UMT TriboLab friction tester (Bruker) equipped with humidity and temperature chambers, with scaled-down parameters derived from an adopted real vehicle braking scenario. Wear surfaces and mechanisms were studied using scanning electron microscopy (Quanta FEG 450 by FEI) equipped with energy-dispersive X-ray microanalysis (Inca System), and a 3D optical microscope (NPFLEX by Bruker). The vibrational response was monitored by a triaxial ICP accelerometer (PCB Piezotronics, Model 356A45) and an oscilloscope (Agilent Technologies, Model MSOX2024A). The data were analyzed using Matlab (MathWorks, Version R2015A). Physical adsorption depends strongly on the surface topography; nevertheless, the chemical species and products generated at the friction surfaces are the dominant factor dictating the quantity of adsorbed humidity and species. Their chemistry differs from that of the bulk, and this complex correlation must be studied further. Adsorption considerably influences the friction performance (friction and its stability, wear, noise, and environmental response/pollution capacity) of brake pads. Further studies addressing these phenomena are recommended.
Keywords – humidity, friction, scanning electron microscopy, brake pads, X-ray microanalysis

    Rapid, Reliable Tissue Fractionation Algorithm for Commercial Scale Biorefineries

    Get PDF
    Increasing demand, limited supply, and the impact on the environment raise significant concerns about the consumption of fossil fuels. Because of this, global economies are facing two significant energy challenges: i) securing the supply of reliable and affordable energy and ii) achieving the transformation to a low-carbon, high-efficiency, and sustainable energy system. Recently, there has been growing interest in developing portable transportation fuels from biomass in order to reduce petroleum consumption in the transportation sector, a major contributor to greenhouse gas emissions. A cost-effective conversion process to produce biofuels from lignocellulosic biomass relies not just on the material quality, but also on the biorefinery's ability to measure the quality of the source biomass. The quality of the feedstock is crucial for a commercially viable conversion platform. This research mainly focuses on developing sensing techniques using 3D X-ray imaging to study quality factors such as material composition, ash content, and moisture content, which affect the conversion efficiency, equipment wear, and product yield in bioethanol production, on a real-time or near real-time basis.

    Large deflection analysis of composite beams

    Get PDF
    A beam made of composite material undergoing large deflections is analyzed based on a higher-order shear deformation theory. Composite materials offer several advantages over conventional materials in the form of improved strength-to-weight ratio, high impact strength, corrosion resistance, and design flexibility. The Euler-Bernoulli beam theory is valid only for small deflections and may be too restrictive in a number of applications. The formulation of the large deflection analysis of composite beams is carried out using the principle of virtual work. The spatial discretization is done using an h-p version finite element method. The nonlinear large deflection equations are solved using an iterative process. Results are presented in the form of deflections as a function of position.
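The iterative process mentioned for the nonlinear large-deflection equations is typically a Newton-Raphson scheme: linearize the residual about the current deflection, solve for an increment with the tangent stiffness, and repeat until the out-of-balance force vanishes. A minimal one-degree-of-freedom sketch (the cubic hardening term `k3*u**3` is an illustrative stand-in for the geometric nonlinearity, not the paper's actual element formulation):

```python
def solve_newton(f_ext, k=100.0, k3=50.0, tol=1e-10, max_iter=50):
    """Solve the nonlinear equilibrium k*u + k3*u**3 = f_ext
    by Newton-Raphson iteration."""
    u = 0.0
    for _ in range(max_iter):
        residual = f_ext - (k * u + k3 * u**3)  # out-of-balance force
        if abs(residual) < tol:
            break
        k_tangent = k + 3.0 * k3 * u**2         # tangent stiffness dF/du
        u += residual / k_tangent               # incremental update
    return u

u = solve_newton(200.0)
# geometric stiffening makes u smaller than the linear estimate 200/100 = 2.0
```

In a finite element setting, `u` becomes the vector of nodal degrees of freedom and `k_tangent` the assembled tangent stiffness matrix, but the iteration loop has the same shape.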

    Cluster Dynamical Mean-Field Theory of the density-driven Mott transition in the one-dimensional Hubbard model

    Full text link
    The one-dimensional Hubbard model is investigated by means of two different cluster schemes suited to introducing short-range spatial correlations beyond single-site Dynamical Mean-Field Theory, namely Cluster Dynamical Mean-Field Theory and its periodized version. It is shown that both cluster schemes describe with extreme accuracy the evolution of the density as a function of the chemical potential from the Mott insulator to the metallic state. Using exact diagonalization to solve the cluster impurity model, we discuss the role of the truncation of the Hilbert space of the bath, and propose an algorithm that gives higher weights to the low-frequency hybridization matrix elements and improves the convergence speed of the algorithm.
    Comment: 6 pages, 4 figures, minor corrections in v
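The proposed low-frequency weighting can be illustrated as a weighted distance between the target hybridization function and its bath parametrization on the fermionic Matsubara axis. The `1/w_n` weight below is a common choice assumed for illustration; the abstract does not specify the exact weight function:

```python
import math

def matsubara_freqs(beta, n_max):
    """Fermionic Matsubara frequencies w_n = (2n+1)*pi/beta, n = 0..n_max-1."""
    return [(2 * n + 1) * math.pi / beta for n in range(n_max)]

def weighted_cost(delta_target, delta_bath, beta, power=1.0):
    """Weighted chi-squared distance between hybridization functions sampled
    on the Matsubara axis; the 1/w_n**power factor emphasizes mismatches at
    low frequencies, which control the low-energy physics."""
    freqs = matsubara_freqs(beta, len(delta_target))
    return sum(abs(dt - db) ** 2 / w ** power
               for dt, db, w in zip(delta_target, delta_bath, freqs))
```

With this weight, a unit mismatch at the lowest frequency costs roughly `(2*n_max - 1)` times more than the same mismatch at the highest sampled frequency, steering the bath fit toward the low-energy sector.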

    Applying Lean Six Sigma and Systematic Layout Planning to Improve Patient Transportation Equipment Storage in an Acute Care Hospital

    Get PDF
    Purpose: The purpose of this project was to optimize the patient transportation process at an acute care hospital to reduce transportation times. Methodology: A detailed Lean Six Sigma study of the patient transport and equipment handling processes helped determine possible ways to reduce the equipment handling time, which in turn reduces the patient transportation time. The Systematic Layout Planning (SLP) approach, usually applied in manufacturing environments, was used to identify which patient transport equipment should be stored in which locations throughout the hospital footprint. The assignment of equipment to locations was determined based on frequency of use, distance, and equipment type. Findings: Key challenges were identified as a lack of traceability of equipment, insufficient storage locations, and storage locations with inappropriate equipment. Through SLP and statistical analysis of patient transport data, pickup locations were identified to minimize distance for high-frequency trips for each mode of transport. Limitations: We provided the recommendations to the hospital to implement, but due to COVID pandemic resource issues they had not yet implemented the recommendations, although they are still planning to do so. Practical Implications and Originality/Value of Paper: The SLP approach combined with the Lean Six Sigma DMAIC method and tools was applied in the hospital environment to potentially reduce patient transport times, in what appears to be the first such research study applied to a hospital's patient transportation system. Keywords: Patient transportation, Equipment, Systematic Layout Planning, Healthcare, Lean Six Sigma. Paper Type: Case Study
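The frequency-and-distance assignment logic described in the methodology can be sketched as a small scoring routine. All location names, pickup points, and numbers below are hypothetical, and real SLP also weighs qualitative closeness ratings not modeled here:

```python
def assign_storage(trips, locations):
    """For each equipment type, pick the storage location that minimizes the
    total frequency-weighted travel distance over observed transport trips.

    trips:     list of (equipment, pickup_point, trips_per_week)
    locations: dict mapping location -> {pickup_point: distance}
    """
    equipment_types = {eq for eq, _, _ in trips}
    assignment = {}
    for eq in equipment_types:
        def cost(loc):
            return sum(freq * locations[loc][pt]
                       for e, pt, freq in trips if e == eq)
        assignment[eq] = min(locations, key=cost)
    return assignment

# Hypothetical example data:
trips = [("wheelchair", "ER", 30), ("wheelchair", "Radiology", 5),
         ("stretcher", "ICU", 20)]
locations = {"StoreA": {"ER": 10, "Radiology": 50, "ICU": 40},
             "StoreB": {"ER": 60, "Radiology": 20, "ICU": 5}}
# wheelchair -> StoreA (cost 550 vs 1900), stretcher -> StoreB (100 vs 800)
```

Weighting distance by trip frequency is what lets high-frequency routes, rather than raw distances alone, drive the placement decision.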

    Quality Aware Generative Adversarial Networks

    Full text link
    Generative Adversarial Networks (GANs) have become a very popular tool for implicitly learning high-dimensional probability distributions. Several improvements have been made to the original GAN formulation to address some of its shortcomings, such as mode collapse, convergence issues, entanglement, and poor visual quality. While significant effort has been directed toward improving the visual quality of images generated by GANs, it is rather surprising that objective image quality metrics have been employed neither as cost functions nor as regularizers in GAN objective functions. In this work, we show how a distance metric that is a variant of the Structural SIMilarity (SSIM) index (a popular full-reference image quality assessment algorithm) and a novel quality-aware discriminator gradient penalty function inspired by the Natural Image Quality Evaluator (NIQE, a popular no-reference image quality assessment algorithm) can each be used as excellent regularizers for GAN objective functions. Specifically, we demonstrate state-of-the-art performance using the Wasserstein GAN gradient penalty (WGAN-GP) framework over the CIFAR-10, STL10, and CelebA datasets.
    Comment: 10 pages, NeurIPS 201
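The SSIM index underlying the proposed regularizer combines luminance, contrast, and structure comparisons between two images. A minimal global (single-window) SSIM between two grayscale pixel lists, using the standard constants for an 8-bit dynamic range, can be sketched as follows; the paper uses a variant of SSIM whose exact form is not given in the abstract, and real SSIM is computed over local windows and averaged:

```python
def global_ssim(x, y, data_range=255.0):
    """Global SSIM between two equal-length grayscale pixel sequences.
    Stabilizing constants follow the standard choice K1=0.01, K2=0.03."""
    c1 = (0.01 * data_range) ** 2
    c2 = (0.03 * data_range) ** 2
    n = len(x)
    mu_x = sum(x) / n
    mu_y = sum(y) / n
    var_x = sum((v - mu_x) ** 2 for v in x) / n
    var_y = sum((v - mu_y) ** 2 for v in y) / n
    cov = sum((a - mu_x) * (b - mu_y) for a, b in zip(x, y)) / n
    # luminance * (contrast + structure) terms in the standard SSIM form
    return ((2 * mu_x * mu_y + c1) * (2 * cov + c2)) / \
           ((mu_x**2 + mu_y**2 + c1) * (var_x + var_y + c2))

img = [10.0, 50.0, 200.0, 30.0]
print(global_ssim(img, img))  # identical images give SSIM = 1.0
```

A distance suitable as a GAN regularizer is then `1 - SSIM`, which is zero for identical images and grows as structural similarity degrades.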