282 research outputs found

    Integrating product function design, production technology optimization and process equipment planning on the example of hybrid additive manufacturing

    New technologies can yield high market potential, but they also challenge engineering capabilities. Additive manufacturing, for example, enables great freedom of design and economical production of small batch sizes. However, there are major challenges: a large variety of new additive technologies, a limited choice of materials and mostly high production costs as a result of long production times. Since today's production requires an economical implementation, the focus needs to be on hybrid production, which combines the advantages of additive and conventional manufacturing technologies. This requires an integrated optimization of the product design, the manufacturing technology chain and the operative equipment. This paper presents an approach for such integrated planning with the aim of economically feasible hybrid production. In general, the interdependencies between product and manufacturing technology need to be exploited for optimization in the early stages of the product life cycle. To achieve high customer value, the product requirements have to be analyzed in detail, both to find an optimal product function and to identify degrees of design freedom that do not influence the product function and can therefore be adapted to optimize production. Moreover, possible changes in the capabilities of manufacturing technologies and, subsequently, of the operative equipment and machines can be anticipated to further enhance production. After identifying optimal combinations of product design and manufacturing technology chains, the operative equipment has to be selected and optimally configured, and this configuration needs to be validated based on the final product design. The integration of product design, manufacturing technology optimization and operative process planning enables companies to identify and realize high economic potential early in their value creation process and can thus contribute to improved competitiveness.

    Deep Learning for Automated Product Design

    Product development is a highly complex process that has to be individually adapted depending on the companies involved, the product to be developed and the designers concerned. Within this process, the approach and the know-how of the designer are highly individual and can often only be described in a rule-based manner with great effort. Nevertheless, numerous routine tasks can be identified that offer enormous automation potential. Machine Learning, especially Deep Learning, has proven immensely capable of identifying patterns and extracting knowledge from complex data sets. Autoencoder networks are suitable for converting different kinds of 3D input data, e.g. Point Clouds, into compact latent representations and vice versa. Point Clouds are a universal representation of 3D objects and can be derived from various 3D data formats. The goal of the presented approach is to use Deep Learning algorithms to identify design patterns specific to a product family from their underlying latent representations and to use the extracted knowledge to automatically generate new latent object representations fulfilling distinct product feature specifications. A deep Autoencoder network with state-of-the-art reconstruction quality is used to encode Point Clouds into latent representations. In this approach, a conditional Generative Adversarial Network operating in latent space is introduced for the generation of class-, characteristic- and dimension-conditioned objects. The model is quantitatively evaluated by comparing given specifications with the implemented features of generated objects. The presented findings can be used to support designers in the creation process by automatically proposing appropriate objects, as well as in the adaptation of future product variants to different requirements. This relieves the designer of time-consuming routine tasks and significantly reduces the effort of knowledge transfer between designers.
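The generation step described above can be illustrated with a minimal sketch: a generator maps a noise vector plus a condition vector to a latent code, which the autoencoder's decoder would then turn into a point cloud. All dimensions, weights and the single-layer generator below are illustrative assumptions, not the paper's trained architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

LATENT_DIM = 8   # hypothetical size of the autoencoder's latent space
COND_DIM = 3     # hypothetical one-hot condition (class/characteristic/dimension)
NOISE_DIM = 4

# Hypothetical generator weights; in the paper these would be learned
# adversarially by the conditional GAN operating in latent space.
W = rng.normal(size=(NOISE_DIM + COND_DIM, LATENT_DIM))
b = np.zeros(LATENT_DIM)

def generate_latent(condition: np.ndarray) -> np.ndarray:
    """Map (noise, condition) to a latent code; a decoder network
    would reconstruct a point cloud from this code."""
    z = rng.normal(size=NOISE_DIM)
    return np.tanh(np.concatenate([z, condition]) @ W + b)

latent = generate_latent(np.array([1.0, 0.0, 0.0]))  # condition on class 0
print(latent.shape)  # (8,)
```

Conditioning simply by concatenating the condition vector to the noise input is the standard conditional-GAN construction; the real model would use deeper networks and an adversarial training loop.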

    Attitudes of medical students towards interprofessional education:A mixed-methods study

    Background: Interprofessional Education (IPE) aims to improve students' attitudes towards collaboration and teamwork, and leads to improved patient care upon graduation. However, the best time to introduce IPE into the undergraduate curriculum is still under debate. Methods: We used a mixed-methods design based on a sequential explanatory model. Medical students from all six years at the University of Bern, Switzerland (n = 683) completed an online survey about attitudes towards interprofessional learning using a scale validated for German speakers (G-IPAS). Thirty-one medical students participated in nine semi-structured interviews focusing on their experience of interprofessional learning and on the possible impact it might have on their professional development. Results: Women showed better attitudes in the G-IPAS across all years (p = 0.007). Pre-clinical students showed more positive attitudes towards IPE [Year 1 to Year 3 (p = 0.011)]. Students correctly defined IPE and its core dimensions. They appealed for more organized IPE interventions throughout the curriculum. Students also acknowledged the relevance of IPE for their future professional performance. Conclusions: These findings support an early introduction of IPE into the medical curriculum. Although students realise that interprofessional learning is fundamental to high-quality patient care, there are still obstacles and stereotypes to overcome. Trial registration: ISRCTN 41715934.

    Assessment and ranking of potential development sites for the sustainable development of the Bonn/Rhein-Sieg/Ahrweiler region

    The site assessment and ranking system for the Bonn/Rhein-Sieg/Ahrweiler region, developed within the BMBF-funded project NEILA (Nachhaltige Entwicklung durch interkommunales Landmanagement, i.e. sustainable development through inter-municipal land management), is based on criteria agreed upon across municipalities and provides decision support for prioritizing and developing potential sites for settlement development. The system was developed in close exchange with planners and decision-makers in the region in order to draw on local expert knowledge and to increase the acceptance of the site assessment. The assessed potential sites were compiled from various sources (the federal states' monitoring systems, existing concepts, a restriction analysis) and through intensive dialogue with the municipal administrations. The result is a site ranking that is available to the municipalities of the region in a web GIS. It can be edited and extended, and it presents a wide range of information on the sites surveyed in the region. For each site, it shows how the individual criteria (for example public transport accessibility, local supply, or proximity to green spaces) and the indicators based on them (suitability for housing or commerce, open-space significance) are rated. The system makes it possible to prioritize and compare sites at both the municipal and the regional level. It thus contributes to shaping future regional settlement development sustainably on the basis of objective criteria and a transparent methodology.
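The core of a criteria-based site ranking like the one described can be sketched as a weighted scoring scheme. The site names, criteria keys and weights below are purely illustrative assumptions, loosely modelled on the criteria named in the abstract; the real system aggregates inter-municipally agreed criteria into several indicators.

```python
# Hypothetical normalized criterion scores (0..1) per candidate site.
sites = {
    "site_A": {"transit_access": 0.9, "local_supply": 0.7, "green_proximity": 0.4},
    "site_B": {"transit_access": 0.5, "local_supply": 0.8, "green_proximity": 0.9},
    "site_C": {"transit_access": 0.2, "local_supply": 0.3, "green_proximity": 0.6},
}
# Hypothetical weights, e.g. for a housing-suitability indicator.
weights = {"transit_access": 0.5, "local_supply": 0.3, "green_proximity": 0.2}

def score(criteria: dict) -> float:
    """Weighted sum of criterion scores, giving one indicator value."""
    return sum(weights[k] * v for k, v in criteria.items())

ranking = sorted(sites, key=lambda s: score(sites[s]), reverse=True)
print(ranking)  # ['site_A', 'site_B', 'site_C']
```

Separate weight sets per indicator (housing, commerce, open space) would yield the multiple parallel rankings the abstract describes, and the editable weights mirror the system's extensibility.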

    Efficient implementation of atom-density representations

    Physically motivated and mathematically robust atom-centered representations of molecular structures are key to the success of modern atomistic machine learning. They lie at the foundation of a wide range of methods to predict the properties of both materials and molecules and to explore and visualize their chemical structures and compositions. Recently, it has become clear that many of the most effective representations share a fundamental formal connection: they can all be expressed as a discretization of n-body correlation functions of the local atom density, suggesting the opportunity of standardizing and, more importantly, optimizing their evaluation. We present an implementation, named librascal, whose modular design lends itself both to developing refinements of the density-based formalism and to rapid prototyping of new rotationally equivariant atomistic representations. As an example, we discuss smooth overlap of atomic positions (SOAP) features, perhaps the most widely used member of this family of representations, to show how the expansion of the local density can be optimized for any choice of radial basis set. We discuss the representation in the context of a kernel ridge regression model, commonly used with SOAP features, and analyze how the computational effort scales for each of the individual steps of the calculation. By applying data reduction techniques in feature space, we show how to reduce the total computational cost by a factor of up to 4 without affecting the model's symmetry properties and without significantly impacting its accuracy.
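The pipeline sketched in the abstract (features, feature-space reduction, kernel ridge regression) can be illustrated on toy data. The random matrices below are stand-ins for SOAP feature vectors, the SVD truncation is a generic analogue of the paper's feature-space data reduction, and the linear kernel is chosen purely for brevity; none of this reproduces librascal's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy stand-ins for per-structure feature vectors (n_samples x n_features).
X = rng.normal(size=(50, 20))
y = X @ rng.normal(size=20)          # toy linear target

# Feature-space reduction: project onto the leading singular directions,
# a simple analogue of the data-reduction step described in the abstract.
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
X_red = Xc @ Vt[:10].T               # keep 10 of 20 feature dimensions

def krr_fit_predict(Xtr, ytr, Xte, reg=1e-8):
    """Kernel ridge regression with a linear kernel K = X X^T."""
    K = Xtr @ Xtr.T
    alpha = np.linalg.solve(K + reg * np.eye(len(ytr)), ytr)
    return (Xte @ Xtr.T) @ alpha

pred = krr_fit_predict(X_red[:40], y[:40], X_red[40:])
print(pred.shape)  # (10,)
```

With the kernel trick, the cost of the regression step depends on the number of training structures rather than the feature dimension, which is why shrinking the feature space mainly accelerates the feature-evaluation and kernel-construction steps.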

    Characterizing quantum instruments: from non-demolition measurements to quantum error correction

    In quantum information processing, quantum operations are often interleaved with measurements that produce classical data. Due to the information gain from classical measurement outputs, non-unitary dynamical processes can take place on the system, whose time evolution common quantum channel descriptions fail to capture. Quantum measurements are correctly treated by means of so-called quantum instruments, which capture both the classical outputs and the post-measurement quantum states. Here we present a general recipe to characterize quantum instruments, alongside its experimental implementation and analysis. Thereby, the full dynamics of a quantum instrument can be captured, exhibiting details of the quantum dynamics that would be overlooked with common tomography techniques. For illustration, we apply our characterization technique to a quantum instrument used for the detection of qubit loss and leakage, which was recently implemented as a building block in a quantum error correction (QEC) experiment (Nature 585, 207-210 (2020)). Our analysis reveals unexpected and in-depth information about the failure modes of the implementation of the quantum instrument. We then numerically study the implications of these experimental failure modes for QEC performance when the instrument is employed as a building block in QEC protocols on a logical qubit. Our results highlight the importance of carefully characterizing and modelling failure modes in quantum instruments, as compared to simplistic hardware-agnostic phenomenological noise models, which fail to predict the undesired behavior of faulty quantum instruments. The presented methods and results are directly applicable to generic quantum instruments. Comment: 28 pages, 21 figures.
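The quantum-instrument formalism itself is compact: an instrument assigns to each classical outcome k a completely positive map, and the maps together are trace-preserving. The sketch below uses the simplest possible instance, a projective computational-basis measurement of one qubit, as our own illustration; it is not the loss/leakage-detection instrument characterized in the paper.

```python
import numpy as np

# Kraus operators of the instrument: one CP map rho -> K_k rho K_k^dagger
# per classical outcome k; here, projectors onto |0> and |1>.
K = [np.array([[1, 0], [0, 0]], dtype=complex),   # outcome 0
     np.array([[0, 0], [0, 1]], dtype=complex)]   # outcome 1

def apply_instrument(rho: np.ndarray):
    """Return (probability, normalized post-measurement state) per outcome."""
    results = []
    for Kk in K:
        unnorm = Kk @ rho @ Kk.conj().T
        p = np.trace(unnorm).real
        results.append((p, unnorm / p if p > 0 else unnorm))
    return results

plus = np.array([1, 1], dtype=complex) / np.sqrt(2)
rho = np.outer(plus, plus.conj())                 # the state |+><+|
for p, post in apply_instrument(rho):
    print(round(p, 3))                            # prints 0.5 twice
```

Unlike a quantum channel, which would only return the averaged post-measurement state sum_k K_k rho K_k^dagger, the instrument keeps the per-outcome probabilities and conditional states, which is exactly the extra information the paper's characterization procedure reconstructs.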