
    CFD modeling in Industry 4.0: New perspectives for smart factories

    The industrial market is becoming increasingly competitive, and companies need ever more advanced resources to gain an advantage over competitors. Simulation, for example, is part of the Industry 4.0 technology portfolio and a key tool for layout re-configuration, enabling flexible product customization as well as the optimization of manufacturing processes. For these reasons, Computational Fluid Dynamics (CFD) simulation can provide a competitive advantage for smart factories in the light of the possibilities offered by new technologies. The research focuses on a conceptual solution to integrate CFD simulation with Industry 4.0 technologies, opening new opportunities for companies in terms of growth and competitiveness.

    On nonlocally interacting metrics, and a simple proposal for cosmic acceleration

    We propose a simple, nonlocal modification to general relativity (GR) on large scales, which provides a model of late-time cosmic acceleration in the absence of the cosmological constant and with the same number of free parameters as in standard cosmology. The model is motivated by adding to the gravity sector an extra spin-2 field interacting nonlocally with the physical metric coupled to matter. The form of the nonlocal interaction is inspired by the simplest form of the Deser-Woodard (DW) model, $\alpha R\frac{1}{\Box}R$, with one of the Ricci scalars replaced by a constant $m^2$; gravity is therefore modified in the infrared by adding a simple term of the form $m^2\frac{1}{\Box}R$ to the Einstein-Hilbert term. We study cosmic expansion histories, and demonstrate that the new model can provide background expansions consistent with observations if $m$ is of the order of the Hubble expansion rate today, in contrast to the simple DW model, which has no viable cosmology. The model is best fit by $w_0\sim-1.075$ and $w_a\sim0.045$. We also compare the cosmology of the model to that of Maggiore and Mancarella (MM), $m^2R\frac{1}{\Box^2}R$, and demonstrate that the viable cosmic histories follow the standard-model evolution more closely than in the MM model. We further demonstrate that the proposed model possesses the same number of physical degrees of freedom as GR. Finally, we discuss the appearance of ghosts in the local formulation of the model, and argue that they are unphysical and harmless to the theory, keeping the physical degrees of freedom healthy. Comment: 47 pages in JCAP style, 7 figures. Some discussions extended in response to referee's comments. Version accepted for publication in JCAP
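    As a rough sketch (our notation; the overall normalization and sign convention are assumptions, not taken from the abstract), the infrared modification described above amounts to a gravitational action of the form

```latex
S_{\mathrm{grav}} \;=\; \frac{M_{\mathrm{Pl}}^{2}}{2}\int \mathrm{d}^{4}x\,
\sqrt{-g}\,\Bigl(R \;+\; m^{2}\,\frac{1}{\Box}R\Bigr),
```

    so that for $m$ of order the present Hubble rate $H_0$ the extra term is negligible at short distances but becomes relevant on cosmological scales.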

    Tour recommendation for groups

    Consider a group of people who are visiting a major touristic city, such as NY, Paris, or Rome. It is reasonable to assume that each member of the group has his or her own interests or preferences about places to visit, which in general may differ from those of other members. Still, people almost always want to hang out together, and so the following question naturally arises: What is the best tour that the group could take together in the city? This problem underpins several challenges, ranging from understanding people’s expected attitudes towards potential points of interest, to modeling and providing good and viable solutions. Formulating this problem is challenging because of multiple competing objectives. For example, making the entire group as happy as possible in general conflicts with the objective that no member becomes disappointed. In this paper, we address the algorithmic implications of the above problem by providing various formulations that take into account the overall group as well as the individual satisfaction and the length of the tour. We then study the computational complexity of these formulations, we provide effective and efficient practical algorithms, and, finally, we evaluate them on datasets constructed from real city data.
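    The tension between group-wide and individual satisfaction described above can be sketched as a scored objective. The function below is purely illustrative (the names, the data layout, and the weight `lam` are assumptions, not the paper's formulations): it blends a utilitarian term (average happiness) with an egalitarian one (the least-happy member).

```python
def tour_score(tour, interests, lam=0.5):
    """Score a candidate tour for a group.

    tour: list of points of interest (POIs) visited together.
    interests: dict mapping member -> dict of POI -> satisfaction in [0, 1].
    lam: trade-off weight between average and worst-case satisfaction.
    """
    # Total satisfaction each member gets from the shared tour.
    per_member = [sum(prefs.get(poi, 0.0) for poi in tour)
                  for prefs in interests.values()]
    utilitarian = sum(per_member) / len(per_member)  # average happiness
    egalitarian = min(per_member)                    # least-happy member
    return lam * utilitarian + (1 - lam) * egalitarian
```

    Varying `lam` moves the objective between "maximize total happiness" and "leave no member disappointed", which is exactly the kind of conflict the abstract highlights.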

    Simple Dynamics for Plurality Consensus

    We study a \emph{plurality-consensus} process in which each of $n$ anonymous agents of a communication network initially supports an opinion (a color chosen from a finite set $[k]$). Then, in every (synchronous) round, each agent can revise its color according to the opinions currently held by a random sample of its neighbors. It is assumed that the initial color configuration exhibits a sufficiently large \emph{bias} $s$ towards a fixed plurality color, that is, the number of nodes supporting the plurality color exceeds the number of nodes supporting any other color by $s$ additional nodes. The goal is for the process to converge to the \emph{stable} configuration in which all nodes support the initial plurality. We consider a basic model in which the network is a clique and the update rule (called here the \emph{3-majority dynamics}) of the process is the following: each agent looks at the colors of three random neighbors and then applies the majority rule (breaking ties uniformly). We prove that the process converges in time $\mathcal{O}(\min\{k, (n/\log n)^{1/3}\}\,\log n)$ with high probability, provided that $s \geqslant c\sqrt{\min\{2k, (n/\log n)^{1/3}\}\,n\log n}$. We then prove that our upper bound is tight as long as $k \leqslant (n/\log n)^{1/4}$. This fact implies an exponential time-gap between the plurality-consensus process and the \emph{median} process studied by Doerr et al. in [ACM SPAA'11]. A natural question is whether looking at more (than three) random neighbors can significantly speed up the process. We provide a negative answer to this question: in particular, we show that samples of polylogarithmic size can speed up the process by a polylogarithmic factor only. Comment: Preprint of journal version
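    The update rule above is simple enough to simulate directly. The sketch below implements one synchronous round of the 3-majority dynamics on a clique (sampling the three neighbors with replacement, a simplifying assumption; the paper's exact sampling model may differ).

```python
import random
from collections import Counter

def three_majority_round(colors):
    """One synchronous round of the 3-majority dynamics on a clique:
    each agent looks at the colors of three random agents and adopts
    the majority color, breaking ties uniformly at random."""
    n = len(colors)
    new_colors = []
    for _ in range(n):
        sample = [colors[random.randrange(n)] for _ in range(3)]
        counts = Counter(sample)
        top = max(counts.values())
        # Uniform tie-breaking among the most frequent sampled colors.
        new_colors.append(random.choice(
            [c for c, cnt in counts.items() if cnt == top]))
    return new_colors

# With a large initial bias, the process quickly reaches consensus.
random.seed(0)
colors = [0] * 55 + [1] * 5
for _ in range(500):
    if len(set(colors)) == 1:
        break
    colors = three_majority_round(colors)
```

    Running such a simulation for varying $n$, $k$, and bias $s$ is a convenient way to see the convergence behavior the bounds above describe.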

    Closing the gap: Exact maximum likelihood training of generative autoencoders using invertible layers

    In this work, we provide an exact likelihood alternative to the variational training of generative autoencoders. We show that VAE-style autoencoders can be constructed using invertible layers, which offer a tractable exact likelihood without the need for any regularization terms. This is achieved while leaving complete freedom in the choice of encoder, decoder and prior architectures, making our approach a drop-in replacement for the training of existing VAEs and VAE-style models. We refer to the resulting models as Autoencoders within Flows (AEF), since the encoder, decoder and prior are defined as individual layers of an overall invertible architecture. We show that the approach results in strikingly higher performance than architecturally equivalent VAEs in terms of log-likelihood, sample quality and denoising performance. In a broad sense, the main ambition of this work is to close the gap between the normalizing flow and autoencoder literature under the common framework of invertibility and exact maximum likelihood.
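    The exact-likelihood ingredient the abstract refers to is the change-of-variables formula that invertible layers make tractable: log p_X(x) = log p_Z(f(x)) + log|det J_f(x)|. A minimal sketch follows, where the elementwise affine map is an illustrative stand-in for a learned invertible layer, not the AEF architecture itself.

```python
import numpy as np

def affine_forward(x, scale, shift):
    """Invertible elementwise affine map z = scale * x + shift.
    Its Jacobian is diagonal, so the log-determinant is cheap."""
    z = scale * x + shift
    log_det_jac = np.sum(np.log(np.abs(scale)))
    return z, log_det_jac

def exact_log_likelihood(x, scale, shift):
    """Exact log p(x) under a standard normal prior on z, via the
    change-of-variables formula that invertible layers enable."""
    z, log_det = affine_forward(x, scale, shift)
    log_prior = -0.5 * np.sum(z**2 + np.log(2 * np.pi))
    return log_prior + log_det
```

    Because the likelihood is exact, it can be maximized directly by gradient ascent, with no variational lower bound or regularization term.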

    On the Fundamental Periods of Vibration of Flat-Bottom Ground-Supported Circular Silos containing Grain-like Material

    Despite the significant amount of research effort devoted to understanding the structural behavior of grain silos, each year a large number of silos still fail due to bad design and poor construction, with a frequency much larger than for other civil structures. In particular, silos frequently fail during large earthquakes, as occurred during the 1999 Chi-Chi, Taiwan earthquake, when almost all the silos located in Taichung Port, 70 km from the epicenter, collapsed. The EQE report stated that "the seismic design practice that is used for the design and construction of such facilities clearly requires a major revision". This indicates that current design procedures have limits, and that significant advancements in the knowledge of the structural behavior of silo structures are still necessary. The present work presents an analytical formulation for the assessment of the natural periods of grain silos. The predictions of the novel formulation are compared with experimental findings and numerical simulations.

    Systematic Human Reliability Analysis (SHRA): A New Approach to Evaluate Human Error Probability (HEP) in a Nuclear Plant

    Emergency management in industrial plants is a fundamental issue in ensuring the safety of operators. It involves two fundamental aspects: system reliability and human reliability. System reliability is the capability of ensuring the functional properties within a variability of work conditions, considering the possible deviations due to unexpected events. However, system reliability is strongly related to the reliability of its weakest component. The complexity of the processes can generate incidental situations, and the worker (human reliability) appears to be the weakest part of the whole system. The complexity of systems influences the operator's ability to take decisions during emergencies. The aim of the present research is to develop a new approach to evaluate human error probability (HEP), called Systematic Human Reliability Analysis (SHRA). The proposed approach considers internal and external factors that affect the operator's ability. The new approach is based on Nuclear Action Reliability Assessment (NARA), Simplified Plant Analysis Risk Human Reliability (SPAR-H), and the relationships among Performance Shaping Factors (PSFs). The present paper analyses some shortcomings of literature approaches, especially their limitations regarding working time: we estimated HEP beyond the 8-hour work standard under emergency conditions. Combining the advantages of these three methodologies allows us to propose a HEP analysis for accident scenarios and emergencies. SHRA can be used to estimate human reliability during emergencies, and has been applied to a nuclear accident scenario considering 24 hours of working time. The SHRA results highlight the most important internal and external factors that affect the operator's ability.
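    The SPAR-H ingredient mentioned above scales a nominal error probability by PSF multipliers. The sketch below shows the standard SPAR-H style calculation with its adjustment for large composite PSFs (the function name and the three-negative-PSF threshold follow the usual SPAR-H convention; this is an illustration of that ingredient, not the SHRA procedure itself).

```python
def spar_h_hep(nhep, psf_multipliers):
    """SPAR-H style human error probability.

    nhep: nominal human error probability for the task type.
    psf_multipliers: performance shaping factor multipliers (>1 degrades
    performance, <1 improves it).
    """
    composite = 1.0
    for m in psf_multipliers:
        composite *= m
    # Count PSFs rated negatively (multiplier > 1).
    negative = sum(1 for m in psf_multipliers if m > 1)
    if negative >= 3:
        # Adjustment keeping the result a valid probability when the
        # composite PSF is large.
        return nhep * composite / (nhep * (composite - 1) + 1)
    return min(nhep * composite, 1.0)
```

    The adjustment in the second branch matters precisely in the long-duration emergency scenarios the abstract targets, where several PSFs degrade at once and a naive product would exceed 1.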

    Toward coherent space–time mapping of seagrass cover from satellite data: an example of a Mediterranean lagoon

    Seagrass meadows are a highly productive and economically important shallow coastal habitat. Their sensitivity to natural and anthropogenic disturbances, combined with their importance for local biodiversity, carbon stocks, and sediment dynamics, motivate a frequent monitoring of their distribution. However, generating time series of seagrass cover from field observations is costly, and mapping methods based on remote sensing require restrictive conditions on seabed visibility, limiting the frequency of observations. In this contribution, we examine the effect of accounting for environmental factors, such as the bathymetry and median grain size (D50) of the substrate as well as the coordinates of known seagrass patches, on the performance of a random forest (RF) classifier used to determine seagrass cover. Using 148 Landsat images of the Venice Lagoon (Italy) between 1999 and 2020, we trained an RF classifier with only spectral features from Landsat images and seagrass surveys from 2002 and 2017. Then, by adding the features above and applying a time-based correction to predictions, we created multiple RF models with different feature combinations. We tested the quality of the resulting seagrass cover predictions from each model against field surveys, showing that bathymetry, D50, and coordinates of known patches exert an influence that is dependent on the training Landsat image and seagrass survey chosen. In models trained on a survey from 2017, where using only spectral features causes predictions to overestimate seagrass surface area, no significant change in model performance was observed. Conversely, in models trained on a survey from 2002, the addition of the out-of-image features and particularly coordinates of known vegetated patches greatly improves the predictive capacity of the model, while still allowing the detection of seagrass beds absent in the reference field survey. 
Applying a time-based correction eliminates small temporal variations in predictions, further improving predictions that already performed well before correction. We conclude that accounting for the coordinates of known seagrass patches, together with applying a time-based correction, has the most potential to produce reliable, frequent predictions of seagrass cover. While this case study alone is insufficient to explain how geographic location information influences the classification process, we suggest that it is linked to the inherent spatial auto-correlation of seagrass meadow distribution. In the interest of improving remote-sensing classification, and particularly to develop our capacity to map vegetation across time, we identify this phenomenon as warranting further research.
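    The feature-augmentation step described above can be sketched as stacking out-of-image features alongside the spectral bands before fitting a random forest. The example below is an illustrative toy on synthetic data (feature names, shapes, and hyperparameters are assumptions, not the authors' pipeline).

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n_pixels = 200

# Per-pixel features: spectral bands plus out-of-image predictors.
spectral = rng.random((n_pixels, 6))    # e.g. six Landsat bands
bathymetry = rng.random((n_pixels, 1))  # depth per pixel
d50 = rng.random((n_pixels, 1))         # median grain size of substrate
coords = rng.random((n_pixels, 2))      # pixel coordinates

X = np.hstack([spectral, bathymetry, d50, coords])
y = rng.integers(0, 2, n_pixels)        # 1 = seagrass present (toy labels)

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
pred = clf.predict(X)
```

    With real data, `feature_importances_` on the fitted forest is one way to see how much the bathymetry, D50, and coordinate columns contribute relative to the spectral bands.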

    Hydrides as high capacity anodes in lithium cells: an Italian “Futuro in Ricerca di Base FIRB-2010” project

    Automotive and stationary energy storage are among the most recently-proposed and still unfulfilled applications for lithium ion devices. Higher energy and power, and superior safety standards, well beyond the present state of the art, are required to extend the Li-ion battery market to these challenging fields, but such a goal can only be achieved by the development of new materials with improved performance. Focusing on negative electrode materials, alloying and conversion chemistries have been widely explored in the last decade to circumvent the main weakness of intercalation processes: the limitation in capacity to one or at most two lithium atoms per host formula unit. Among the many proposed conversion chemistries, hydrides have been proposed and investigated since 2008. In lithium cells, these materials undergo a conversion reaction that gives metallic nanoparticles surrounded by an amorphous matrix of LiH. Among all of the reported conversion materials, hydrides have outstanding theoretical properties and have been only marginally explored, making this class of materials an interesting playground for both fundamental and applied research. In this review, we illustrate the most relevant results achieved in the framework of the Italian National Research Project FIRB 2010 Futuro in Ricerca “Hydrides as high capacity anodes in lithium cells” and possible future perspectives of research for this class of materials in electrochemical energy storage devices.
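    The conversion reaction mentioned above can be written generically (a textbook form, with M a generic metal and $x$ the hydrogen stoichiometry; the specific hydrides studied in the project are not named in the abstract):

```latex
\mathrm{MH}_{x} + x\,\mathrm{Li}^{+} + x\,e^{-}
\;\longrightarrow\; \mathrm{M} + x\,\mathrm{LiH}
```

    For example, for magnesium hydride this reads $\mathrm{MgH_2} + 2\,\mathrm{Li^+} + 2\,e^- \rightarrow \mathrm{Mg} + 2\,\mathrm{LiH}$, exchanging two lithium atoms per formula unit, in contrast to the one-to-two lithium limit of intercalation hosts.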