The effect of temperature on the fracture mechanism in 2014Al/15 vol.% Al2O3 composite
The tensile fracture strain, fracture stress and fracture mode of a discontinuously reinforced aluminum matrix composite, 2014Al/15 vol.% Al2O3, were determined and compared with those of the unreinforced matrix material, 2014Al, at various temperatures. Tests were conducted under uniaxial tension at elevated temperatures at a strain rate of 0.1 s^-1. Both the tensile fracture strain and the fracture stress of the composite were found to be lower than those of the matrix material. The tensile fracture mode changed from transgranular to intergranular fracture between 400 °C and 500 °C for both materials. For the composite, at temperatures below 400 °C the growth and coalescence of voids occurred via a dislocation creep process, primarily along the Al–Al2O3 interface. Above 400 °C, voids initiated and grew at the Al–Al2O3 interface and at grain boundaries via a diffusion creep process. Void growth proceeded not along the tensile direction but along the Al–Al2O3 interface and grain boundaries, resulting in a low fracture strain. A method for quantitatively determining the characteristics of void initiation and growth is discussed.
Data production models for the CDF experiment
The data production for the CDF experiment is conducted on a large Linux PC farm designed to meet the needs of data collection at a maximum rate of 40 MByte/sec. We present two data production models that exploit advances in computing and communication technology. The first production farm is a centralized system that has achieved a stable data processing rate of approximately 2 TByte per day. The recently upgraded farm has been migrated to the SAM (Sequential Access to data via Metadata) data handling system. The software and hardware of the CDF production farms have been successful in providing large computing and data throughput capacity to the experiment.
Comment: 8 pages, 9 figures; presented at HPC Asia2005, Beijing, China, Nov 30 - Dec 3, 2005
Usability and Usefulness of Circularity Indicators for Manufacturing Performance Management
Advances in industrial digitalization present many opportunities for process and product data exploitation in manufacturing, unlocking new systemic measures of performance beyond a single machine, process, facility area and even beyond the factory gates. However, existing data models and manufacturing systems' performance measures are still focused on productivity, quality and delivery time, which could potentially lead to an accelerated linear economy. To shift to more circular industrial systems, we need to identify and assess circularity opportunities in ways that align the goals of sustainable and industrial development. In this study, micro-level circularity indicators were reviewed, selected, analysed and tested in a manufacturing company to evaluate their usability and usefulness in guiding process improvements. The aim is to enable circular and eco-efficient solutions towards sustainable production systems. Usability and usefulness of the indicators are essential to their integration into established environmental and operations management systems. The main contribution of this study is the identification of key features that make circularity indicators usable and useful from a manufacturer's perspective. The conclusion also suggests directions for further research on tools and methods to support circular manufacturing.
Data processing model for the CDF experiment
The data processing model for the CDF experiment is described. Data processing reconstructs events from parallel data streams taken with different combinations of physics event triggers and further splits the events into specialized physics datasets. The design of the processing control system faces strict requirements on bookkeeping records, which trace the status of data files and event contents during processing and storage. The computing architecture was updated to meet the mass data flow of the Run II data collection, recently upgraded to a maximum rate of 40 MByte/sec. The data processing facility consists of a large cluster of Linux computers, with data movement managed by the CDF data handling system to a multi-petabyte Enstore tape library. The latest processing cycle has achieved a stable speed of 35 MByte/sec (3 TByte/day) and can be readily scaled by increasing CPU and data-handling capacity as required.
Comment: 12 pages, 10 figures, submitted to IEEE-TN
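The quoted throughput figures can be checked with simple arithmetic; this sketch (not from the paper) converts the sustained and maximum rates to daily volumes, using decimal units (1 TByte = 10^6 MByte) as the abstract's own figures imply.

```python
# Back-of-envelope check of the rates quoted in the abstract.
SECONDS_PER_DAY = 24 * 60 * 60  # 86,400 s

rate_mb_per_s = 35  # stable processing speed, MByte/sec
daily_tb = rate_mb_per_s * SECONDS_PER_DAY / 1_000_000  # MByte -> TByte
print(f"{daily_tb:.2f} TByte/day")  # ~3.02, matching the quoted 3 TByte/day

max_rate_mb_per_s = 40  # maximum Run II collection rate, MByte/sec
ceiling_tb = max_rate_mb_per_s * SECONDS_PER_DAY / 1_000_000
print(f"{ceiling_tb:.2f} TByte/day at the 40 MByte/sec ceiling")  # ~3.46
```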
Hierarchical colour image segmentation by leveraging RGB channels independently
In this paper, we introduce a hierarchical colour image segmentation method based on cuboid partitioning, using simple statistical features of the pixel intensities in the RGB channels. Estimating the difference between two colours is a challenging task; since most colour models are not perceptually uniform, an alternative strategy is highly desirable. To address this issue, we present a new colour distance measure based on the inconsistency of pixel intensities in an image, which is more consistent with human perception. Constructing a reliable set of superpixels from an image is fundamental for subsequent merging. As cuboid partitioning is a strong candidate for producing superpixels, we apply agglomerative merging to the output of the proposed cuboid partitioning to yield the final segmentation. The proposed cuboid-based segmentation algorithm significantly outperforms not only quadtree-based segmentation but also existing state-of-the-art segmentation algorithms in terms of segmentation quality on the benchmark datasets used in image segmentation. © 2019, Springer Nature Switzerland AG
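The pipeline described above (partition into blocks, compute a statistical colour feature per block, then agglomeratively merge similar neighbours) can be sketched in a few lines. This is an illustrative toy only: the paper's actual cuboid-partitioning scheme, colour distance and merge criterion are not reproduced here, so the fixed block size, mean-RGB feature and Euclidean threshold below are all assumptions.

```python
import numpy as np

def partition_blocks(img, bs):
    """Split an HxWx3 image into a grid of bs x bs blocks and return
    each block's mean RGB colour (a simple statistical feature)."""
    h, w, _ = img.shape
    gh, gw = h // bs, w // bs
    means = np.zeros((gh, gw, 3))
    for i in range(gh):
        for j in range(gw):
            means[i, j] = img[i*bs:(i+1)*bs, j*bs:(j+1)*bs].reshape(-1, 3).mean(axis=0)
    return means

def merge_blocks(means, thresh):
    """Agglomeratively merge 4-adjacent blocks whose mean-colour distance
    is below `thresh`, using union-find; returns a label grid."""
    gh, gw, _ = means.shape
    parent = list(range(gh * gw))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path compression
            x = parent[x]
        return x
    for i in range(gh):
        for j in range(gw):
            for di, dj in ((1, 0), (0, 1)):  # down and right neighbours
                ni, nj = i + di, j + dj
                if ni < gh and nj < gw:
                    if np.linalg.norm(means[i, j] - means[ni, nj]) < thresh:
                        ra, rb = find(i * gw + j), find(ni * gw + nj)
                        if ra != rb:
                            parent[rb] = ra
    return np.array([find(k) for k in range(gh * gw)]).reshape(gh, gw)

# Toy image: left half dark, right half bright -> two segments expected.
img = np.zeros((8, 8, 3))
img[:, 4:] = 200.0
labels = merge_blocks(partition_blocks(img, 2), thresh=50.0)
print(len(np.unique(labels)))  # 2
```

A real implementation would recurse on inhomogeneous cuboids rather than use a fixed grid, but the merge step operates the same way on the resulting superpixels.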