330,380 research outputs found
Use of the dynamic simulation to reduce handling complexity in the manufacturing process
Many companies have begun using dynamic simulation to support their optimization teams in improving business processes. 3D visualization facilitates understanding of the links among processes and their connections, and it can contribute significantly to an appropriate implementation, which aims at saving costs, simplifying processes, introducing new or innovative processes, and so on. The application field is not a limitation for 3D visualization: predictive simulation can be applied to any process, from storage, logistics, and handling, through production-line optimization, to distribution. The submitted paper deals with optimizing the production process to reduce handling demands for a company in the automotive industry. Businesses currently face the issue of handling complexity, which carries relatively high costs depending on the number of unnecessary and chaotic trips within production processes. Any change of production requires the charging method to be modified, which is connected with an increase in non-productive rides. The article introduces a variant solution using dynamic simulation as a powerful tool for process optimization.
Computational bioseparation process development
With the increase in computational power over the last decades, the use of modeling and simulation in process design for the (petro)chemical industry has become commonplace. Computational tools like ASPEN are standard in the design and operational analysis of (petro)chemical plants. In the biopharmaceutical field, however, such modeling and simulation techniques are only recently being investigated for use and (potential) implementation. As the workhorse of purification in the biopharmaceutical industry, chromatography is a good candidate for this modeling approach. Detailed mechanistic models describing chromatographic separation behavior are available, and software to simulate chromatography is becoming more widely available (e.g. DelftChrom, CADET). A bioseparation process normally consists of multiple chromatographic and conditioning steps, so an extremely large design space needs to be investigated. This may lead to prohibitive simulation times, even on state-of-the-art fast computers, when only mechanistic models are used. This presentation will show the implementation of a hybrid bioseparation process design approach using a combination of mechanistic models, artificial neural networks, and high-throughput experimentation for process development and optimization of the production of industrially relevant biologicals.
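The hybrid idea above can be illustrated with a toy sketch: a cheap surrogate screens a large design space, and only a shortlist is confirmed with the expensive "mechanistic" model. The one-parameter yield function and the polynomial surrogate (standing in for the neural network) are illustrative assumptions, not the models used in the work.

```python
import numpy as np

def mechanistic_model(x):
    # Stand-in for an expensive chromatography simulation (hypothetical):
    # product yield as a smooth function of one operating parameter in [0, 1].
    return np.exp(-((x - 0.6) ** 2) / 0.08)

# Step 1: a handful of expensive mechanistic runs provide training data.
x_train = np.linspace(0.0, 1.0, 30)
y_train = mechanistic_model(x_train)

# Step 2: fit a cheap surrogate (a polynomial here, standing in for the ANN).
surrogate = np.poly1d(np.polyfit(x_train, y_train, deg=6))

# Step 3: screen a large design space using only the surrogate.
candidates = np.linspace(0.0, 1.0, 10_000)
shortlist = candidates[np.argsort(surrogate(candidates))[-200:]]

# Step 4: confirm only the shortlist with the mechanistic model.
best = shortlist[np.argmax(mechanistic_model(shortlist))]
print(f"best operating point ~ {best:.3f}")
```

The expensive model is called 30 times for training plus 200 times for confirmation, instead of 10,000 times; that ratio is the point of the hybrid approach.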
Evolutionary Multiobjective Design in Automotive Development
This paper describes the use of evolutionary algorithms to solve multiobjective optimization problems arising at different stages of the automotive design process. The problems considered are black-box optimization scenarios: definitions of the decision space and the design objectives are given, together with a procedure to evaluate any decision alternative with regard to the design objectives, e.g., a simulation model; however, no further information about the objective function is available. In order to provide a practical introduction to the use of multiobjective evolutionary algorithms, this article explores the following three case studies: design space exploration of road trains, parameter optimization of adaptive cruise controllers, and multiobjective system identification. In addition, selected research topics in evolutionary multiobjective optimization are illustrated along with each case study, highlighting the practical relevance of the theoretical results through real-world application examples. The algorithms used in these studies were implemented based on the PISA (Platform and Programming Language Independent Interface for Search Algorithms) framework. Besides helping to structure the presentation of different algorithms in a coherent way, PISA also reduces the implementation effort considerably.
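The black-box multiobjective setting above rests on Pareto dominance: one alternative dominates another if it is at least as good in every objective and strictly better in at least one. A minimal sketch (minimization; the two-objective design alternatives are hypothetical):

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimization)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def nondominated(points):
    """Return the Pareto-optimal subset of a list of objective vectors."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q is not p)]

# Hypothetical two-objective alternatives, e.g. (cost, fuel consumption).
designs = [(1.0, 5.0), (2.0, 3.0), (3.0, 4.0), (4.0, 1.0)]
# (3.0, 4.0) is dominated by (2.0, 3.0); the other three trade off against each other.
print(nondominated(designs))
```

An evolutionary multiobjective algorithm such as those built on PISA evolves a population toward this nondominated front rather than toward a single optimum.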
Energy Modeling and Implementation of Complex Building Systems
Complex/dynamic systems and technologies are gaining traction in architecture, but accurate analysis and simulation of conflicting dynamic systems within a building model has yet to be achieved. Most ideas of analysis and simulation revolve around a set process: model one instance of a building (i.e. without changing parameters) and analyze in a separate program. The use of a parametric base for analysis/simulation plugins, as well as an easily manipulatable and responsive model, would not only further the accuracy of testing the effects of multiple dynamic systems, but become a new tool that merges model, behavior, analysis and simulation to strive for efficient implementation of these technologies and act as a platform for testing systems’ compensation for introduced variables (bio-responsiveness, enviro-responsiveness, manipulability, system-responsiveness). My method for testing this system utilizes Grasshopper, which excels at: providing a base for parametric plugins linking ‘static’ software, using data trees for complex behavioral modeling, and easing the manipulability of a parametric model. This method for analysis and optimization would facilitate the efficient implementation of dynamic/advanced/sustainable technologies in any number of building typologies.
An Investigation and Simulation of Novel Dynamic Routing Methods
Routing in networks is a multi-objective, multi-constraint optimization problem, because current networks are highly dynamic environments. Currently implemented solutions are single-metric, whereas an optimal multi-metric solution is needed. This research work investigates novel multi-metric solutions to this optimization problem. Recent work found that applying a natural optimization process, ant colony optimization, to the routing problem yields a multi-metric dynamic solution, and reported two slightly different implementations of it. Network agents are used to sense the status of the network and feed the necessary information back to the network nodes, which use it to update their routing tables. The two implementations differ in the method (philosophy) used to update the information in the routing tables held by the network nodes. This research work suggests a new method to update those routing tables, merging modified versions of the previous two methods in order to overcome the disadvantages of each.

A discrete event simulation system is built to test the suggested routing method together with the previous two routing methods for comparison purposes. This simulation system represents a prototype for the development of a general network simulation tool. It is capable of collecting various types of simulation statistics and generating trace files for detailed studies of the network and for testing purposes. An expandable, structured, C-pointer-based implementation is used to code the system.

The system is tested on various networks, and the simulation results show improvements in network performance by reducing overall delay and increasing throughput. Moreover, the suggested routing method balances the load in the network.
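The pheromone mechanism behind ant-colony routing can be sketched as follows: ants walk from source to destination, shorter walks deposit more pheromone, and edge choice is biased by accumulated pheromone. The toy network, deposit rule, and parameters are illustrative assumptions, not the implementations compared in this work.

```python
import random

def ant_colony_route(graph, src, dst, n_ants=200, evaporation=0.1, seed=1):
    """Minimal ant-colony routing sketch: return the preferred next hop
    out of src after n_ants pheromone-guided walks from src to dst."""
    rng = random.Random(seed)
    pher = {(u, v): 1.0 for u in graph for v in graph[u]}
    for _ in range(n_ants):
        node, walk = src, [src]
        while node != dst and len(walk) < 20:
            nxt = [v for v in graph[node] if v not in walk] or list(graph[node])
            weights = [pher[(node, v)] for v in nxt]
            node = rng.choices(nxt, weights=weights)[0]
            walk.append(node)
        if node == dst:
            # Evaporate everywhere, then reinforce the edges on this walk;
            # shorter walks deposit more pheromone per edge.
            for e in pher:
                pher[e] *= (1.0 - evaporation)
            for u, v in zip(walk, walk[1:]):
                pher[(u, v)] += 1.0 / len(walk)
    return max(graph[src], key=lambda v: pher[(src, v)])

# Toy topology: A->B->D is 2 hops, A->C->E->F->D is 4 hops.
net = {"A": ["B", "C"], "B": ["D"], "C": ["E"], "E": ["F"], "F": ["D"], "D": []}
print(ant_colony_route(net, "A", "D"))
```

The positive feedback (more pheromone attracts more ants, which deposit more pheromone) is what makes the routing adapt dynamically: if the short path degraded, its reinforcement would fall and traffic would drift to alternatives.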
Energy Modeling and Implementation of Complex Building Systems, Pt. 2
Complex/dynamic systems and technologies are gaining traction in architecture, but accurate analysis and simulation of conflicting dynamic systems within a building model has yet to be achieved. Most ideas of analysis and simulation revolve around a set process: model one instance of a building (i.e. without changing parameters) and analyze in a separate program. The use of a parametric base for analysis/simulation plugins, as well as an easily manipulatable and responsive model, would not only further the accuracy of testing the effects of multiple dynamic systems, but become a new tool that merges model, behavior, analysis and simulation to strive for efficient implementation of these technologies and act as a platform for testing systems’ compensation for introduced variables (bio-responsiveness, enviro-responsiveness, manipulability, system-responsiveness). My method for testing this system utilizes Grasshopper, which excels at: providing a base for parametric plugins linking ‘static’ software, using data trees for complex behavioral modeling, and easing the manipulability of a parametric model. This method for analysis and optimization would facilitate the efficient implementation of dynamic/advanced/sustainable technologies in any number of building typologies.
Analyzing SPSA approaches to solve the non-linear non-differentiable problems arising in the assisted calibration of traffic simulation models
Mathematical and simulation models of systems lie at the core of decision support systems, and their role becomes more critical as the system under decision grows more complex. The decision process usually encompasses the optimization of some utility function that evaluates the performance indicators measuring the impacts of the decisions. A difficulty directly related to the complexity of the system arises when the function to be optimized is non-analytical, non-differentiable, and non-linear, and can only be evaluated by simulation. Simulation-optimization techniques are especially suited to these cases, and their use is increasing in traffic models, an archetypal case of complex, dynamic systems exhibiting highly stochastic characteristics. In this approach, simulation is used to evaluate the objective function, and a non-differentiable optimization technique solves the optimization problem. Simultaneous Perturbation Stochastic Approximation (SPSA) is one of the most popular such techniques. This thesis analyzes, discusses, and presents computational results for the application of this technique to the calibration of a traffic simulation model of a Swedish highway section. Variants of SPSA, replacing the usual gradient approach with a combination of normalized parameters and a penalized objective function, are proposed in this study, motivated by an exhaustive analysis of the behavior of classical SPSA, where problems arose from variables of different magnitudes. A varied set of software environments has been used, combining RStudio for the analysis, Python and MATLAB for the SPSA implementation, AIMSUN as the traffic model simulator, SQLite for retrieving the simulated data, and Tableau for visualizing data and results.
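A plain (unmodified) SPSA iteration can be sketched as follows: the full gradient is approximated from just two loss evaluations per step, by perturbing all parameters simultaneously with a random sign vector. The toy L1 calibration objective and the gain constants are illustrative assumptions, not the thesis's traffic-model setup.

```python
import numpy as np

def spsa_minimize(loss, theta0, n_iter=1000, a=0.2, c=0.1, seed=0):
    """Classical SPSA for minimization: two loss evaluations per iteration
    estimate the gradient, regardless of the parameter dimension."""
    rng = np.random.default_rng(seed)
    theta = np.asarray(theta0, dtype=float)
    for k in range(1, n_iter + 1):
        ak = a / k ** 0.602                 # standard gain decay schedules
        ck = c / k ** 0.101
        delta = rng.choice([-1.0, 1.0], size=theta.shape)  # Rademacher signs
        g_hat = (loss(theta + ck * delta) - loss(theta - ck * delta)) / (2 * ck * delta)
        theta -= ak * g_hat
    return theta

# Toy non-differentiable "calibration" objective (hypothetical target parameters).
target = np.array([1.5, -0.5])
loss = lambda th: np.abs(th - target).sum()
theta_hat = spsa_minimize(loss, [0.0, 0.0])
print(theta_hat)
```

The appeal in simulation-optimization is the fixed cost of two simulator runs per iteration; the thesis's variants additionally normalize parameters so that differently scaled variables receive comparable perturbations.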
Smart detectors for Monte Carlo radiative transfer
Many optimization techniques have been invented to reduce the noise that is inherent in Monte Carlo radiative transfer simulations. As the typical detectors used in Monte Carlo simulations do not take into account all the information contained in the impacting photon packages, there is still room to optimize this detection process and the corresponding estimate of the surface brightness distributions. We want to investigate how all the information contained in the distribution of impacting photon packages can be used optimally to decrease the noise in the surface brightness distributions and hence to increase the efficiency of Monte Carlo radiative transfer simulations.

We demonstrate that the estimate of the surface brightness distribution in a Monte Carlo radiative transfer simulation is similar to the estimate of the density distribution in an SPH simulation. Based on this similarity, a recipe is constructed for smart detectors that take full advantage of the exact location of the impact of the photon packages. Several types of smart detectors, each corresponding to a different smoothing kernel, are presented.

We show that smart detectors, while preserving the same effective resolution, reduce the noise in the surface brightness distributions compared to classical detectors. The most efficient smart detector realizes a noise reduction of about 10%, which corresponds to a reduction of the required number of photon packages (i.e. a reduction of the simulation run time) of 20%. As the practical implementation of the smart detectors is straightforward and the additional computational cost is completely negligible, we recommend the use of smart detectors in Monte Carlo radiative transfer simulations.

Comment: 7 pages, 5 figures, accepted for publication in MNRAS.
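The kernel-smoothing idea behind smart detectors can be sketched in one dimension: instead of binning each photon package into the single pixel it hits, its contribution is spread over nearby pixels with a kernel centred on the exact impact position. The flat input distribution, triangular kernel, and bin count below are illustrative assumptions (the paper works with 2-D surface-brightness maps and several kernel types).

```python
import numpy as np

rng = np.random.default_rng(42)

# Impact positions of "photon packages" on a 1-D detector of unit length,
# drawn from a flat surface-brightness distribution (toy setup).
x = rng.uniform(0.0, 1.0, 5000)
nbins = 50
edges = np.linspace(0.0, 1.0, nbins + 1)
centers = 0.5 * (edges[:-1] + edges[1:])

# Classical detector: each package contributes only to the pixel it hits.
classical, _ = np.histogram(x, bins=edges)
classical = classical / len(x)

# "Smart" detector: each package is spread over nearby pixels with a
# triangular kernel centred on its exact impact position, normalized so
# every package still deposits exactly one unit of weight.
h = 2.0 / nbins  # kernel width (assumption; sets the effective resolution)
w = np.clip(1.0 - np.abs(centers[:, None] - x[None, :]) / h, 0.0, None)
smart = (w / w.sum(axis=0)).sum(axis=1) / len(x)

# Both estimate the same flat distribution; the smoothed one is less noisy.
print(classical.std(), smart.std())
```

Because the kernel weights are computed at detection time from values already in hand, the extra cost is negligible, which mirrors the paper's argument for adopting smart detectors by default.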
Robotic simulation of textile as concrete reinforcement and formwork.
New possibilities arise for concrete construction in architecture: traditional formwork can gradually be replaced by flexible textile. At the same time, textile reinforcement combined with fabric formwork introduces an innovative, integrated solution for the fabrication of concrete. Based on a simple understanding of textile weaving and knitting techniques, this project concentrates on the architectural production and the structural optimization of textile as both concrete reinforcement and formwork. Furthermore, we present a robotic simulation of the process, developed through a series of computational experiments to research the sequence of weaving and/or knitting. Through the computational process and the design simulations, the research is firmly rooted in analog and digital exploration of the material and its implementation in architecture, with particular emphasis on the convergence of robotics and computation. Note that the paper deals mainly with the software and the weaving simulation as part of a larger research project, without addressing the production of physical artefacts.