Parametric Analysis of Particle Spreading with Discrete Element Method
The spreading of metallic powder on the printing platform is vital in most additive manufacturing methods, including direct laser sintering. Several processing parameters, such as particle size, inter-particle friction, blade speed, and blade gap size, affect the spreading process and, therefore, the final product quality. The objective of this study is to parametrically analyze the particle flow behavior and the effect of the aforementioned parameters on the spreading process using the discrete element method (DEM). To effectively address the vast parameter space within computational constraints, novel parameter sweep algorithms based on low-discrepancy sequences (LDS) are utilized in conjunction with parallel computing. Based on the parametric analysis, optimal material properties and machine setup are proposed for higher-quality spreading. Modeling suggests that lower friction, smaller particle size, lower blade speed, and a gap of two times the particle diameter result in a higher-quality spreading process. In addition, a two-parameter Weibull distribution is adopted to investigate the influence of particle size distribution. The result suggests that smaller particles with a narrower distribution produce a higher-quality flow, with a proper selection of gap. Finally, parallel computing, in conjunction with the LDS parameter sweep algorithm, effectively shrinks the parameter space and improves overall computational efficiency.
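The abstract does not specify which low-discrepancy sequence the authors use; as one illustration of the idea, the sketch below sweeps a four-dimensional parameter space with a Halton sequence, whose points fill the space more evenly than independent random draws. The parameter names and bounds are invented for illustration and are not taken from the study.

```python
# Toy sketch of an LDS parameter sweep (Halton sequence); the four parameter
# ranges below are hypothetical, not values from the paper.

def radical_inverse(index, base):
    """Van der Corput radical inverse of `index` in the given prime base."""
    f, r = 1.0, 0.0
    while index > 0:
        f /= base
        r += f * (index % base)
        index //= base
    return r

def halton_point(index, primes=(2, 3, 5, 7)):
    """One point of the d-dimensional Halton sequence in [0, 1)^d."""
    return [radical_inverse(index, p) for p in primes]

# Illustrative bounds: (friction coeff, particle diameter [um],
#                       blade speed [mm/s], gap / particle diameter)
bounds = [(0.1, 0.6), (20.0, 80.0), (10.0, 100.0), (1.0, 4.0)]

def sweep(n_samples):
    """Map Halton points onto the physical parameter ranges."""
    samples = []
    for i in range(1, n_samples + 1):  # start at 1: index 0 maps to the origin
        u = halton_point(i)
        samples.append([lo + ui * (hi - lo)
                        for ui, (lo, hi) in zip(u, bounds)])
    return samples

points = sweep(8)  # 8 well-spread DEM runs instead of a dense grid
```

Each sample would parameterize one DEM simulation, so the runs can be dispatched independently across compute nodes, which is the parallelization the abstract alludes to.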
Computational fluid dynamics modelling of a polymer electrolyte membrane fuel cell under transient automotive operations
A polymer electrolyte membrane (PEM) fuel cell is probably the most promising technology that will replace conventional internal combustion engines in the near future. As a primary power source for an automobile, the transient performance of a PEM fuel cell is of prime importance. In this thesis, a comprehensive, three-dimensional, two-phase, multi-species computational fuel cell dynamics model is developed in order to investigate the effect of flow-field design on the magnitude of current overshoot/undershoot and characteristics of current response when the cell is subjected to different voltage change patterns representing an automotive operation.
The meshing strategy specific to PEM fuel cell modelling is studied in a systematic manner and employed in all analyses presented in this thesis. The predicted results compare very well with experimental data under both steady-state and transient operations. Two computational domains are used – the straight single-channel and practical-scale square cells with parallel, single-serpentine, and triple-serpentine flow-fields.
The results from the straight single-channel cell suggest that the magnitude of current overshoot/undershoot increases with the voltage change rate. The behaviour of a current response curve is the result of complex interplay between water content at both sides of the membrane. It is also found that current overshoot/undershoot is amplified by the presence of water flooding in the cell. The results from the square cell reveal that current overshoot/undershoot is caused by non-uniformity of local current density over the active area, confirming the effect of flow-field geometry on the transient response of the cell. By comparing the transient performance of the three flow-fields, a direct relationship between the degree of water flooding in the cell and the magnitude of current overshoot/undershoot has been found. It is concluded that a cell with superior water-removal ability will experience smaller current overshoot/undershoot.
Survey and Analysis of Production Distributed Computing Infrastructures
This report has two objectives. First, we describe a set of the production
distributed infrastructures currently available, so that the reader has a basic
understanding of them. This includes explaining why each infrastructure was
created and made available and how it has succeeded and failed. The set is not
complete, but we believe it is representative.
Second, we describe the infrastructures in terms of their use, which is a
combination of how they were designed to be used and how users have found ways
to use them. Applications are often designed and created with specific
infrastructures in mind, with both an appreciation of the existing capabilities
provided by those infrastructures and an anticipation of their future
capabilities. Here, the infrastructures we discuss were often designed and
created with specific applications in mind, or at least specific types of
applications. The reader should understand how the interplay between the
infrastructure providers and the users leads to such usages, which we call
usage modalities. These usage modalities are really abstractions that exist
between the infrastructures and the applications; they influence the
infrastructures by representing the applications, and they influence the
applications by representing the infrastructures.
Learning Scheduling Algorithms for Data Processing Clusters
Efficiently scheduling data processing jobs on distributed compute clusters
requires complex algorithms. Current systems, however, use simple generalized
heuristics and ignore workload characteristics, since developing and tuning a
scheduling policy for each workload is infeasible. In this paper, we show that
modern machine learning techniques can generate highly-efficient policies
automatically. Decima uses reinforcement learning (RL) and neural networks to
learn workload-specific scheduling algorithms without any human instruction
beyond a high-level objective such as minimizing average job completion time.
Off-the-shelf RL techniques, however, cannot handle the complexity and scale of
the scheduling problem. To build Decima, we had to develop new representations
for jobs' dependency graphs, design scalable RL models, and invent RL training
methods for dealing with continuous stochastic job arrivals. Our prototype
integration with Spark on a 25-node cluster shows that Decima improves the
average job completion time over hand-tuned scheduling heuristics by at least
21%, achieving up to 2x improvement during periods of high cluster load.
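Decima itself uses graph neural networks over jobs' dependency graphs, which the abstract does not detail; the sketch below only illustrates the underlying policy-gradient idea on made-up job features: a linear policy scores runnable jobs, a softmax turns scores into selection probabilities, and one REINFORCE step nudges the weights toward higher-reward choices. All feature names and values are hypothetical.

```python
# Toy policy-gradient scheduler sketch (not Decima's actual architecture).
import math
import random

random.seed(0)

# Each job is a hypothetical feature vector: (remaining work, pending tasks).
jobs = [(3.0, 2.0), (1.0, 5.0), (4.0, 1.0)]
weights = [0.0, 0.0]  # linear policy parameters

def softmax(scores):
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    z = sum(exps)
    return [e / z for e in exps]

def choose(weights, jobs):
    """Sample a job to schedule next from the softmax policy."""
    probs = softmax([sum(w * x for w, x in zip(weights, j)) for j in jobs])
    r, acc = random.random(), 0.0
    for i, p in enumerate(probs):
        acc += p
        if r <= acc:
            return i, probs
    return len(jobs) - 1, probs

def reinforce_step(weights, jobs, action, probs, reward, lr=0.01):
    """One REINFORCE update: grad log pi(a) = x_a - E[x] under the policy."""
    expected = [sum(p * j[k] for p, j in zip(probs, jobs))
                for k in range(len(weights))]
    return [w + lr * reward * (jobs[action][k] - expected[k])
            for k, w in enumerate(weights)]

idx, probs = choose(weights, jobs)
# Reward would come from the environment, e.g. negative job completion time.
new_weights = reinforce_step(weights, jobs, idx, probs, reward=1.0)
```

The hard parts the abstract mentions, representing arbitrary DAGs, scaling the model, and training under continuous stochastic job arrivals, are exactly what this toy omits.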
Venice, California, Gentrification in a Neo-Bohemian Beach Town: Structural Violence, Power Structures and Ideological Justification for Injustice, Exclusion and Dispossession
Pathways to poverty: Theoretical and empirical analyses
The prevalence of poverty in advanced economies represents a challenge, both to economic theory and to society. We know that poverty is perpetuated by low levels of educational investment amongst disadvantaged children, but we have no credible theoretical explanation for the observed degree of that apparent underinvestment, and we have not yet developed sufficient policy tools to break the intergenerational cycle of deprivation. In response, this thesis undertakes theoretical and empirical analyses of the pathways that perpetuate poverty. I demonstrate that divergently low educational investment could arise as an equilibrium response to a grades-focussed educational system; I develop the existing state-of-the-art technique in econometric estimation of the educational production function; and I apply that technique to find strong empirical support for my theoretical model. In addition, my results show that the average child’s propensity to think analytically has a substantial influence over their developmental pathway, which suggests that models of educational investment should adopt a generalisation of Expected Utility Theory that allows agents to maximise one of two possible objective functions.