Revised self-consistent continuum solvation in electronic-structure calculations
The solvation model proposed by Fattebert and Gygi [Journal of Computational
Chemistry 23, 662 (2002)] and Scherlis et al. [Journal of Chemical Physics 124,
074103 (2006)] is reformulated, overcoming some of the numerical limitations
encountered and extending its range of applicability. We first recast the
problem in terms of induced polarization charges that act as a direct mapping
of the self-consistent continuum dielectric; this allows us to define a functional
form for the dielectric that is well behaved both in the high-density region of
the nuclear charges and in the low-density region where the electronic
wavefunctions decay into the solvent. Second, we outline an iterative procedure
to solve the Poisson equation for the quantum fragment embedded in the solvent
that does not require multi-grid algorithms, is trivially parallel, and can be
applied to any Bravais crystallographic system. Last, we capture some of the
non-electrostatic or cavitation terms via a combined use of the quantum volume
and quantum surface [Physical Review Letters 94, 145501 (2005)] of the solute.
The resulting self-consistent continuum solvation (SCCS) model provides a very
effective and compact fit of computational and experimental data: the static
dielectric constant of the solvent plus a single fitted parameter suffice to
reproduce the electrostatic energy provided by the PCM model with a mean
absolute error of 0.3 kcal/mol on a set of 240 neutral solutes. Two parameters suffice to fit
experimental solvation energies on the same set with a mean absolute error of
1.3 kcal/mol. A detailed analysis of these results, broken down along different
classes of chemical compounds, shows that several classes of organic compounds
display very high accuracy, with solvation energies in error by only 0.3-0.4
kcal/mol, while larger discrepancies are mostly limited to self-dissociating
species and compounds that form strong hydrogen bonds.

Comment: This article has been accepted by The Journal of Chemical Physics;
after publication it will be available at http://link.aip.org/link/?jcp
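The well-behaved dielectric functional described above can be illustrated with a short sketch. The smooth switching function, the log-density interpolation, and the threshold values `rho_max` and `rho_min` below are illustrative assumptions, not the paper's exact parametrization:

```python
import math

def sccs_dielectric(rho, eps0=78.3, rho_max=5e-3, rho_min=1e-4):
    """Smoothly switch the permittivity from 1 (solute interior, high
    electron density) to the bulk solvent value eps0 (low density).
    Functional form and thresholds here are illustrative assumptions."""
    if rho >= rho_max:
        return 1.0          # inside the quantum fragment: vacuum permittivity
    if rho <= rho_min:
        return eps0         # deep in the solvent: bulk dielectric constant
    # x runs from 0 at rho_max to 1 at rho_min on a log-density scale
    x = (math.log(rho_max) - math.log(rho)) / (math.log(rho_max) - math.log(rho_min))
    # smooth step: t(0) = 0, t(1) = 1, with zero slope at both endpoints,
    # so the dielectric (and its gradient) stays well behaved everywhere
    t = x - math.sin(2.0 * math.pi * x) / (2.0 * math.pi)
    return math.exp(t * math.log(eps0))
```

Because the switching happens on a logarithmic density scale with vanishing slope at both thresholds, the function is flat both in the high-density nuclear region and in the low-density tail where the wavefunctions decay.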
Genetic algorithms for the scheduling in additive manufacturing
Genetic Algorithms (GAs) are introduced to tackle the packing problem. Scheduling in Additive Manufacturing (AM) is also addressed in order to set up a managed market, called "Lonja3D". This market will provide an alternative trading tool based on combinatorial auctions, in which customers can purchase products from manufacturers at the best prices, while manufacturers can optimize their production capacity and reduce their operating costs.

This research has been partially financed by the project "Lonja de Impresión 3D para la Industria 4.0 y la Empresa Digital (LONJA3D)", funded by the Regional Government of Castile and Leon and the European Regional Development Fund (ERDF, FEDER) with grant VA049P17.

Castillo-Rivera, S.; De Antón, J.; Del Olmo, R.; Pajares, J.; López-Paredes, A. (2020). Genetic algorithms for the scheduling in additive manufacturing. International Journal of Production Management and Engineering, 8(2), 59-63. https://doi.org/10.4995/ijpme.2020.12173
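A GA for a simplified version of the packing problem can be sketched as follows. The encoding (a permutation of parts decoded by first-fit onto build plates of fixed capacity), the operators, and all parameter values are illustrative assumptions, not the Lonja3D implementation:

```python
import random

def decode(perm, sizes, capacity):
    """First-fit: place parts, in chromosome order, onto build plates."""
    builds, loads = [], []
    for part in perm:
        for b, load in enumerate(loads):
            if load + sizes[part] <= capacity:
                builds[b].append(part)
                loads[b] += sizes[part]
                break
        else:
            builds.append([part])
            loads.append(sizes[part])
    return builds

def ga_pack(sizes, capacity, pop_size=30, gens=100, seed=0):
    """Permutation-encoded GA minimizing the number of build plates."""
    rng = random.Random(seed)
    n = len(sizes)

    def fitness(perm):
        return len(decode(perm, sizes, capacity))

    population = [rng.sample(range(n), n) for _ in range(pop_size)]
    best = min(population, key=fitness)
    for _ in range(gens):
        nxt = [best[:]]                      # elitism: keep the best chromosome
        while len(nxt) < pop_size:
            # tournament selection of two parents
            a = min(rng.sample(population, 3), key=fitness)
            b = min(rng.sample(population, 3), key=fitness)
            # one-point order crossover keeps the child a valid permutation
            cut = rng.randrange(1, n)
            head = a[:cut]
            child = head + [g for g in b if g not in head]
            if rng.random() < 0.2:           # swap mutation
                i, j = rng.randrange(n), rng.randrange(n)
                child[i], child[j] = child[j], child[i]
            nxt.append(child)
        population = nxt
        best = min(population, key=fitness)
    return decode(best, sizes, capacity)

# hypothetical example: seven parts, build-plate capacity of 10 volume units
builds = ga_pack([4, 4, 3, 3, 2, 2, 2], capacity=10)
```

The permutation encoding sidesteps infeasible chromosomes entirely: any ordering decodes to a valid packing, and the GA only searches over which ordering first-fit should follow.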
Novel Artificial Human Optimization Field Algorithms - The Beginning
New Artificial Human Optimization (AHO) Field Algorithms can be created from
scratch or by adding the concept of Artificial Humans into other existing
Optimization Algorithms. Particle Swarm Optimization (PSO) has been very
popular for solving complex optimization problems due to its simplicity. In
this work, new Artificial Human Optimization Field Algorithms are created by
modifying existing PSO algorithms with AHO Field concepts. These hybrid PSO
algorithms come under the PSO Field as well as the AHO Field. There are hybrid
PSO research articles based on human behavior, human cognition, human thinking,
etc., but there are no hybrid PSO articles based on concepts such as human
disease, human kindness, and human relaxation. This paper proposes new AHO
Field algorithms that address these gaps. Some existing hybrid PSO algorithms
are renamed in this work so that future AHO researchers can easily find these
novel Artificial Human Optimization Field algorithms.
A total of 6 Artificial Human Optimization Field algorithms titled "Human
Safety Particle Swarm Optimization (HuSaPSO)", "Human Kindness Particle Swarm
Optimization (HKPSO)", "Human Relaxation Particle Swarm Optimization (HRPSO)",
"Multiple Strategy Human Particle Swarm Optimization (MSHPSO)", "Human Thinking
Particle Swarm Optimization (HTPSO)" and "Human Disease Particle Swarm
Optimization (HDPSO)" are tested on the Ackley, Beale, Bohachevsky, Booth and
Three-Hump Camel benchmark functions. The results obtained are compared with
those of the standard PSO algorithm.

Comment: 25 pages, 41 figures
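One reading of a relaxation-style hybrid can be sketched as a standard global-best PSO in which particles occasionally skip an update. The `relax_p` rule is a hypothetical interpretation of the human-relaxation idea, not the paper's exact update equations, and the sphere function stands in for the benchmark suite:

```python
import random

def sphere(x):
    """Simple convex benchmark; the paper uses Ackley, Beale, etc."""
    return sum(v * v for v in x)

def relaxed_pso(f, dim=2, swarm=20, iters=200, relax_p=0.1, seed=1):
    """Global-best PSO plus a 'relaxation' step: with probability relax_p
    a particle keeps its position this iteration (an illustrative reading
    of the human-relaxation concept, not the paper's exact rule)."""
    rng = random.Random(seed)
    X = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(swarm)]
    V = [[0.0] * dim for _ in range(swarm)]
    P = [x[:] for x in X]                      # personal bests
    g = min(P, key=f)[:]                       # global best
    w, c1, c2 = 0.7, 1.5, 1.5                  # inertia and learning factors
    for _ in range(iters):
        for i in range(swarm):
            if rng.random() < relax_p:         # particle "relaxes" this round
                continue
            for d in range(dim):
                V[i][d] = (w * V[i][d]
                           + c1 * rng.random() * (P[i][d] - X[i][d])
                           + c2 * rng.random() * (g[d] - X[i][d]))
                X[i][d] += V[i][d]
            if f(X[i]) < f(P[i]):
                P[i] = X[i][:]
                if f(P[i]) < f(g):
                    g = P[i][:]
    return g, f(g)

best, val = relaxed_pso(sphere)
```

Any of the other human-inspired variants would slot into the same loop by replacing the skip rule with a different modification of the velocity update.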
Multiangle social network recommendation algorithms and similarity network evaluation
Multiangle social network recommendation algorithms (MSN) and a new assessment method, called similarity network evaluation (SNE), are both proposed. From the viewpoint of six dimensions, the MSN are classified into six algorithms: the user-based algorithm from the resource point (UBR), the user-based algorithm from the tag point (UBT), the resource-based algorithm from the tag point (RBT), the resource-based algorithm from the user point (RBU), the tag-based algorithm from the resource point (TBR), and the tag-based algorithm from the user point (TBU). Compared with the traditional recall/precision (RP) method, the SNE is simpler, more effective, and easier to visualize. The simulation results show that TBR and UBR are the best algorithms, RBU and TBU the worst, and UBT and RBT in between.
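A minimal sketch of one of these dimensions, the user-based algorithm from the resource point (UBR), might look as follows; the cosine similarity over collected-resource sets and the scoring rule are assumptions for illustration, not the paper's exact formulation:

```python
def cosine(a, b):
    """Cosine similarity between two sets of collected resources."""
    return len(a & b) / ((len(a) ** 0.5) * (len(b) ** 0.5)) if a and b else 0.0

def recommend_ubr(user_resources, target, k=2):
    """User-based recommendation from the resource point (UBR-style):
    rank the other users by overlap of collected resources, then score
    resources the target has not yet collected. Illustrative sketch only."""
    sims = sorted(((cosine(user_resources[target], res), u)
                   for u, res in user_resources.items() if u != target),
                  reverse=True)
    scores = {}
    for sim, u in sims[:k]:                     # k most similar users
        for res in user_resources[u] - user_resources[target]:
            scores[res] = scores.get(res, 0.0) + sim
    return sorted(scores, key=scores.get, reverse=True)
```

Swapping which of the user/resource/tag dimensions plays the "who is similar" role and which plays the "what to score" role yields the other five variants in the same skeleton.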
New numerical approaches for modeling thermochemical convection in a compositionally stratified fluid
Seismic imaging of the mantle has revealed large and small scale
heterogeneities in the lower mantle; specifically structures known as large low
shear velocity provinces (LLSVPs) below Africa and the South Pacific. Most
interpretations propose that the heterogeneities are compositional in nature,
differing in composition from the overlying mantle, an interpretation that
would be consistent with chemical geodynamic models. Numerical modeling of
persistent compositional interfaces presents challenges, even to
state-of-the-art numerical methodology. For example, some numerical algorithms
for advecting the compositional interface cannot maintain a sharp compositional
boundary as the fluid migrates and distorts with time-dependent fingering: the
numerical diffusion added to maintain the upper and lower bounds on the
composition variable, and the stability of the advection method, smears the
interface. In this work we present two new algorithms for maintaining a sharper
computational boundary than the advection methods that are currently openly
available to the computational mantle convection community; namely, a
Discontinuous Galerkin method with a Bound Preserving limiter and a
Volume-of-Fluid interface tracking algorithm. We compare these two new methods
with two approaches commonly used for modeling the advection of two distinct,
thermally driven compositional fields in mantle convection problems: a
high-order accurate finite element advection algorithm that employs an
artificial viscosity technique to maintain the bounds on the composition
variable and the stability of the scheme, and the advection of particles that
carry a scalar quantity representing the location of each compositional field.
All four of these algorithms are implemented in the open-source FEM code
ASPECT.
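The trade-off described above is easy to reproduce in one dimension: a monotone first-order upwind scheme keeps the composition within its initial bounds by construction, but at the cost of numerical diffusion that smears a sharp interface. This is a generic illustration, not code from ASPECT or from the new algorithms:

```python
def upwind_advect(c, velocity, dx, dt, steps):
    """First-order upwind advection on a periodic grid. The update is a
    convex combination of neighbouring values, so the composition stays
    within its initial bounds, at the price of numerical diffusion."""
    nu = velocity * dt / dx              # CFL number, must satisfy 0 <= nu <= 1
    assert 0.0 <= nu <= 1.0
    for _ in range(steps):
        # c[i-1] wraps around at i = 0, giving periodic boundaries
        c = [c[i] - nu * (c[i] - c[i - 1]) for i in range(len(c))]
    return c

# sharp compositional step: composition 1 in the left half, 0 in the right
n = 100
c0 = [1.0 if i < n // 2 else 0.0 for i in range(n)]
c1 = upwind_advect(c0, velocity=1.0, dx=1.0, dt=0.5, steps=40)
```

After 40 steps the total composition is conserved and every value remains in [0, 1], but the initially sharp step has spread over many cells; it is exactly this smearing that motivates bound-preserving limiters and interface-tracking schemes.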
EPiK-a Workflow for Electron Tomography in Kepler.
Scientific workflows integrate data and computing interfaces as configurable, semi-automatic graphs to solve a scientific problem. Kepler is such a software system for designing, executing, reusing, evolving, archiving and sharing scientific workflows. Electron tomography (ET) enables high-resolution views of complex cellular structures, such as cytoskeletons, organelles, viruses and chromosomes, and such imaging investigations produce large datasets. For instance, in electron tomography the size of a 16-fold image tilt series is about 65 gigabytes, with each projection image comprising 4096 by 4096 pixels. When serial sections or montage techniques are used for large-field ET, the dataset is even larger, and for higher-resolution images with multiple tilt series the data size may reach the terabyte range. The demands of mass data processing and complex algorithms require the integration of diverse codes into flexible software structures. This paper describes a workflow for Electron Tomography Programs in Kepler (EPiK). The EPiK workflow embeds the tracking process of IMOD and realizes the main reconstruction algorithms, including filtered backprojection (FBP) from TxBR and iterative reconstruction methods. We have tested the three-dimensional (3D) reconstruction process using EPiK on ET data. EPiK can be a useful toolkit for biology researchers, with the advantages of logical viewing, easy handling, convenient sharing and future extensibility.
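The filtered backprojection step mentioned above can be sketched for the simplest parallel-beam geometry; this is a generic FBP illustration with an ideal ramp filter and nearest-neighbour interpolation, not the TxBR or EPiK implementation:

```python
import numpy as np

def fbp_reconstruct(sinogram, angles_deg):
    """Minimal parallel-beam filtered backprojection: ramp-filter each
    projection in Fourier space, then smear it back across the image
    along its viewing angle. Illustrative sketch only."""
    n_angles, n_det = sinogram.shape
    ramp = np.abs(np.fft.fftfreq(n_det))       # ideal ramp filter |f|
    filtered = np.real(np.fft.ifft(np.fft.fft(sinogram, axis=1) * ramp, axis=1))
    # pixel coordinates centred on the image
    xs = np.arange(n_det) - n_det / 2.0
    X, Y = np.meshgrid(xs, xs)
    recon = np.zeros((n_det, n_det))
    for proj, theta in zip(filtered, np.deg2rad(angles_deg)):
        # detector coordinate seen by each pixel at this viewing angle
        t = X * np.cos(theta) + Y * np.sin(theta) + n_det / 2.0
        idx = np.clip(t.astype(int), 0, n_det - 1)  # nearest-neighbour lookup
        recon += proj[idx]
    return recon * np.pi / (2 * n_angles)
```

A quick sanity check is a synthetic sinogram of a single point at the rotation centre (a constant delta at the central detector for every angle), which should reconstruct to a peak at the centre of the image.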