
    Scalable Path-Level Thermal History Simulation of the PBF Process Validated by Melt Pool Images

    In this paper, we outline the development of a scalable PBF thermal history simulation built on CAPL and based on melt pool physics and dynamics. The new approach inherits linear scalability from CAPL and has three novel ingredients. First, to simulate laser scanning on a solid surface, we discretize the entire simulation domain, rather than only the manufacturing toolpath, by appending fictitious paths to the manufacturing toolpath. Second, to simulate scanning on overlapping toolpaths, the path-scale simulations are initialized with a Voronoi diagram of the line segments discretized from the manufacturing toolpath. Last, we propose a modified conduction model that accounts for the high thermal gradient around the melt pool. We validate the simulation against melt pool images captured with the co-axial melt pool monitoring (MPM) system on the NIST Additive Manufacturing Metrology Testbed (AMMT). Excellent agreement in melt pool length and width is found between simulations and experiments conducted on a custom-controlled laser powder bed fusion (LPBF) testbed on a nickel-alloy (IN625) solid surface. To the best of the authors' knowledge, this paper is the first to validate a full path-scale thermal history against experimentally acquired melt pool images. Comparing the simulation results with the experimental data, we discuss the influence of laser power on melt pool length at the path scale. We also identify possible ways to further improve the accuracy of the CAPL simulation without sacrificing efficiency.
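
    The Voronoi-based initialization described above can be approximated numerically by assigning every point of the discretized domain to its nearest toolpath segment. The sketch below illustrates this idea; the serpentine toolpath, grid resolution, and distance computation are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (not the authors' code): approximating a Voronoi partition of
# the build domain by nearest discretized toolpath segment.
import numpy as np

def point_segment_distance(p, a, b):
    """Distance from points p (N, 2) to the segment with endpoints a, b."""
    ab = b - a
    t = np.clip(((p - a) @ ab) / (ab @ ab), 0.0, 1.0)
    proj = a + t[:, None] * ab
    return np.linalg.norm(p - proj, axis=1)

def voronoi_labels(grid_points, segments):
    """Label each grid point with the index of its nearest toolpath segment."""
    dists = np.stack([point_segment_distance(grid_points, a, b) for a, b in segments])
    return np.argmin(dists, axis=0)

# Hypothetical serpentine toolpath discretized into line segments
path = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 0.1], [0.0, 0.1], [0.0, 0.2], [1.0, 0.2]])
segments = list(zip(path[:-1], path[1:]))

# Regular grid covering the simulation domain (solid surface)
xs, ys = np.meshgrid(np.linspace(0, 1, 200), np.linspace(0, 0.3, 60))
grid = np.column_stack([xs.ravel(), ys.ravel()])

labels = voronoi_labels(grid, segments)  # initialization regions for the path-scale simulation
```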

    AIERO: An algorithm for identifying engineering relationships in ontologies

    Semantic technologies are playing an increasingly popular role as a means for advancing the capabilities of knowledge management systems. Among these advancements, researchers have successfully leveraged semantic technologies, and their accompanying techniques, to improve the representation and search capabilities of knowledge management systems. This paper introduces a further application of semantic techniques. We explore semantic relatedness as a means of facilitating the development of more “intelligent” engineering knowledge management systems. Using semantic relatedness quantifications to analyze and rank concept pairs, this novel approach exploits semantic relationships to help identify key engineering relationships, similar to those leveraged in change management systems, in product development processes. As part of this work, we review several different semantic relatedness techniques, including a meronomic technique recently introduced by the authors. We introduce an aggregate measure, termed “An Algorithm for Identifying Engineering Relationships in Ontologies,” or AIERO, as a means to purposefully quantify semantic relationships within product development frameworks. To assess its consistency and accuracy, AIERO is tested using three separate, independently developed ontologies. The results indicate that AIERO is capable of returning consistent rankings of concept pairs across varying knowledge frameworks. A PCB (printed circuit board) case study then highlights AIERO’s unique ability to leverage semantic relationships to systematically narrow where engineering interdependencies are likely to be found between various elements of product development processes.
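
    As an illustration of the kind of aggregate measure described above, the sketch below combines several hypothetical relatedness scores per concept pair into a weighted average and ranks the pairs. The individual measures, weights, and concept names are assumptions for illustration; AIERO's actual formulation is given in the paper.

```python
# Illustrative sketch only: combine hypothetical relatedness measures (path-based,
# meronomic, gloss overlap) into an aggregate score and rank concept pairs by it.
from typing import Dict, Tuple

def aggregate_relatedness(scores: Dict[str, float], weights: Dict[str, float]) -> float:
    """Weighted average of individual semantic relatedness measures (all in [0, 1])."""
    total_w = sum(weights.values())
    return sum(weights[m] * scores.get(m, 0.0) for m in weights) / total_w

# Hypothetical concept pairs from a product development ontology
pairs: Dict[Tuple[str, str], Dict[str, float]] = {
    ("Capacitor", "PowerSupply"): {"path": 0.70, "meronomic": 0.85, "gloss": 0.60},
    ("Capacitor", "Enclosure"):   {"path": 0.30, "meronomic": 0.20, "gloss": 0.25},
    ("Trace", "SolderMask"):      {"path": 0.55, "meronomic": 0.65, "gloss": 0.50},
}
weights = {"path": 0.4, "meronomic": 0.4, "gloss": 0.2}  # assumed weights

ranked = sorted(pairs.items(), key=lambda kv: aggregate_relatedness(kv[1], weights), reverse=True)
for (a, b), s in ranked:
    print(f"{a} -- {b}: {aggregate_relatedness(s, weights):.2f}")
```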

    Identifying Uncertainty in Laser Powder Bed Fusion Additive Manufacturing Models

    As additive manufacturing (AM) matures, models are beginning to take a more prominent stage in design and process planning. A limitation frequently encountered in AM models is a lack of indication about their precision and accuracy. Often overlooked, model uncertainty is required for validation of AM models, qualification of AM-produced parts, and uncertainty management. This paper presents a discussion on the origin and propagation of uncertainty in laser powder bed fusion (L-PBF) models. Four sources of uncertainty are identified: modeling assumptions, unknown simulation parameters, numerical approximations, and measurement error in calibration data. Techniques to quantify uncertainty in each source are presented briefly, along with estimation algorithms that reduce prediction uncertainty by incorporating online measurements. The methods are illustrated with a case study based on a thermal model designed for melt pool width predictions. Model uncertainty is quantified for single-track experiments, and the effect of online estimation in overhanging structures is studied via simulation.
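
    One common way to quantify the effect of unknown simulation parameters on a melt pool width prediction is Monte Carlo propagation. The sketch below pushes assumed parameter distributions through a simplified Rosenthal-type width estimate; the property values, distributions, and the closed-form model are assumptions and not the paper's thermal model.

```python
# Hedged sketch: Monte Carlo propagation of parameter uncertainty through a
# simplified Rosenthal-type melt pool width estimate,
#   w = sqrt(8*A*P / (e*pi*rho*c*v*(Tm - T0))).
# Nominal IN625-like property values and distributions below are assumptions.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

P, v = 195.0, 0.8              # laser power [W], scan speed [m/s] (assumed settings)
Tm, T0 = 1563.0, 293.0         # melting / ambient temperature [K]
A   = rng.normal(0.40, 0.05, n)      # absorptivity (unknown simulation parameter)
rho = rng.normal(8440.0, 100.0, n)   # density [kg/m^3]
c   = rng.normal(620.0, 40.0, n)     # specific heat [J/(kg K)]

width = np.sqrt(8.0 * A * P / (np.e * np.pi * rho * c * v * (Tm - T0)))  # [m]
lo, med, hi = np.percentile(width * 1e6, [2.5, 50, 97.5])                # [um]
print(f"melt pool width: {med:.0f} um  (95% interval: {lo:.0f}-{hi:.0f} um)")
```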

    Towards Industrial Implementation of Emerging Semantic Technologies

    Every new design, project, or procedure within a company generates a considerable amount of new information and important knowledge. Furthermore, a tremendous amount of legacy knowledge already exists in companies in electronic and non-electronic formats, and techniques are needed for representing, structuring, and reusing this knowledge. Many researchers have spent considerable time and effort developing semantic knowledge management systems, which in theory are presumed to address these problems. Despite significant research investments, little has been done to implement these systems within an industrial setting. In this paper, we identify five main requirements for the development of an industry-ready semantic knowledge management system and discuss how each of these can be addressed. These requirements include the ease of new knowledge management software adoption, the incorporation of legacy information, the ease of use of the user interface, the security of the stored information, and the robustness of the software to support multiple file types and allow for the sharing of information across platforms. Collaboration with Raytheon, a defense and aerospace systems company, allowed our team to develop and demonstrate a successful adoption of semantic abilities by a commercial company. Salient features of this work include a new tool, the e-Design MemoExtractor Software Tool, designed to mine and capture company information, a Raytheon-specific extension to the e-Design Framework, and a novel semantic environment in the form of a customized semantic wiki, SMW+. The advantages of this approach are discussed in the context of the industrial case study with Raytheon.

    Six-Sigma Quality Management of Additive Manufacturing

    Quality is a key determinant in deploying new processes, products, or services and influences the adoption of emerging manufacturing technologies. The advent of additive manufacturing (AM) as a manufacturing process has the potential to revolutionize a host of enterprise-related functions from production to the supply chain. The unprecedented level of design flexibility and expanded functionality offered by AM, coupled with greatly reduced lead times, can potentially pave the way for mass customization. However, widespread application of AM is currently hampered by technical challenges in process repeatability and quality management. The breakthrough effect of six sigma (6S) has been demonstrated in traditional manufacturing industries (e.g., the semiconductor and automotive industries) in the context of quality planning, control, and improvement through the intensive use of data, statistics, and optimization. 6S entails a data-driven DMAIC methodology of five steps: define, measure, analyze, improve, and control. Notwithstanding the sustained successes of the 6S body of knowledge in a variety of established industries, from manufacturing and healthcare to logistics and beyond, there is a dearth of concentrated application of 6S quality management approaches in the context of AM. In this article, we propose to design, develop, and implement a new DMAIC methodology for the 6S quality management of AM. First, we define the specific quality challenges arising from AM layerwise fabrication and mass customization (even one-of-a-kind production). Second, we present a review of AM metrology and sensing techniques, from materials through design, process, and environment, to post-build inspection. Third, we contextualize a framework for realizing the full potential of data from AM systems and emphasize the need for analytical methods and tools. We propose and delineate the utility of new data-driven analytical methods, including deep learning, machine learning, and network science, to characterize and model the interrelationships between engineering design, machine settings, process variability, and final build quality. Fourth, we present the methodologies of ontology analytics, design of experiments (DOE), and simulation analysis for AM system improvements. In closing, new process control approaches are discussed to optimize action plans once an anomaly is detected, with specific consideration of lead time and energy consumption. We posit that this work will catalyze more in-depth investigations and multidisciplinary research efforts to accelerate the application of 6S quality management in AM.
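
    In the spirit of the control step of DMAIC, a basic Shewhart individuals chart on a layerwise quality characteristic can flag process anomalies. The sketch below uses simulated per-layer melt pool widths and standard 3-sigma limits; the data, characteristic, and limits are illustrative assumptions rather than methods from the article.

```python
# Illustrative only: Shewhart individuals control chart on a layerwise quality
# characteristic (e.g., mean melt pool width per layer). Simulated data.
import numpy as np

rng = np.random.default_rng(1)
layer_width = rng.normal(120.0, 4.0, 200)      # simulated per-layer mean widths [um]
layer_width[150:] += 10.0                      # injected drift to illustrate detection

baseline = layer_width[:50]                    # "measure" phase: in-control baseline
center = baseline.mean()
mr_bar = np.abs(np.diff(baseline)).mean()      # average moving range
sigma_hat = mr_bar / 1.128                     # d2 constant for subgroup size 2
ucl, lcl = center + 3 * sigma_hat, center - 3 * sigma_hat

out_of_control = np.where((layer_width > ucl) | (layer_width < lcl))[0]
print(f"center={center:.1f} um, UCL={ucl:.1f}, LCL={lcl:.1f}")
print(f"first out-of-control layer: {out_of_control[0] if out_of_control.size else None}")
```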

    Feature-level data fusion for energy consumption analytics in additive manufacturing

    The issue of Additive Manufacturing (AM) energy consumption is attracting attention in both industry and academia, particularly with the trending adoption of AM technologies in the manufacturing industry. It is crucial to analyze, understand, and manage the energy consumption of AM for better efficiency and sustainability. The energy consumption of AM systems is related to various correlated attributes in different phases of an AM process. Existing studies focus mainly on analyzing the impacts of different processing and material attributes, while factors related to design and the working environment have not received the same amount of attention. Such factors involve features with various dimensions and nested structures that are difficult to handle in the analysis. To tackle these issues, a feature-level data fusion approach is proposed to integrate heterogeneous data into an AM energy consumption model that uncovers energy-relevant information and knowledge. A case study using real-world data collected from a selective laser sintering (SLS) system is presented to validate the proposed approach, and the results indicate that the fusion strategy achieves better energy consumption prediction performance than models built on the individual feature sets. Based on the analysis of feature importance, the design-relevant features are found to have significant impacts on AM energy consumption.
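
    Feature-level fusion of the kind described above amounts to concatenating feature blocks from different phases into one matrix before model fitting. The sketch below does this with synthetic design, process, and environment features and a random forest regressor; the feature names, data, and model choice are assumptions, not the paper's SLS dataset or pipeline.

```python
# Minimal sketch of feature-level fusion (not the paper's pipeline or data):
# design, process, and environment feature blocks are concatenated into one
# feature matrix before fitting a single energy-consumption regressor.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(2)
n = 300
design  = rng.normal(size=(n, 4))   # e.g., part height, bounding-box volume (assumed)
process = rng.normal(size=(n, 3))   # e.g., layer thickness, bed temperature (assumed)
environ = rng.normal(size=(n, 2))   # e.g., ambient temperature, humidity (assumed)

# Synthetic target just to make the sketch runnable
energy = 5.0 * design[:, 0] + 2.0 * process[:, 1] + rng.normal(scale=0.5, size=n)

X = np.hstack([design, process, environ])      # feature-level fusion: one joint matrix
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, energy)

names = ([f"design_{i}" for i in range(4)] + [f"process_{i}" for i in range(3)]
         + [f"env_{i}" for i in range(2)])
for name, imp in sorted(zip(names, model.feature_importances_), key=lambda t: -t[1])[:5]:
    print(f"{name}: {imp:.3f}")
```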

    Study of Gluon versus Quark Fragmentation in $\Upsilon \to gg\gamma$ and $e^{+}e^{-} \to q\bar{q}\gamma$ Events at $\sqrt{s} = 10$ GeV

    Using data collected with the CLEO II detector at the Cornell Electron Storage Ring, we determine the ratio R(chrg) of the mean charged multiplicity observed in Upsilon(1S) -> gg gamma events to the mean charged multiplicity observed in e+e- -> qqbar gamma events. We find R(chrg) = 1.04 +/- 0.02 +/- 0.05 for jet-jet masses less than 7 GeV.
    Comment: 15 pages, postscript file also available through http://w4.lns.cornell.edu/public/CLN

    Performance of the CMS Cathode Strip Chambers with Cosmic Rays

    The Cathode Strip Chambers (CSCs) constitute the primary muon tracking device in the CMS endcaps. Their performance has been evaluated using data taken during a cosmic ray run in fall 2008. Measured noise levels are low, with the number of noisy channels well below 1%. Coordinate resolution was measured for all types of chambers and falls in the range of 47 microns to 243 microns. The efficiencies for local charged-track triggering, hit reconstruction, and segment reconstruction were measured and are above 99%. The timing resolution per layer is approximately 5 ns.