Evaluation methods for improving surface geometry of concrete floors. A case study
Among the various construction activities related to concrete pavement technologies, an important role is reserved for industrial floors. These structures must ensure resistance, stability, durability, reliability and many other properties. In particular, flatness is a special requirement that takes on real significance with respect to functional performance, especially when the pavement has to allow the movement of vehicles and goods, or storage in elevated stacks or shelves. Flatness can be defined in different ways, but in every case it refers to the pavement surface geometry, which has to be even (without superelevated or depressed areas) and level (horizontal, without grades, curvatures or waves). The acceptance limits are defined by technical standards in various countries, together with suitable methods for measurement and control. In many cases, however, these methods are considered neither feasible nor easy, in particular when continuous sampling of the pavement along selected alignments is needed. The paper describes the operating procedures for calculating the FF and FL indexes, according to the ASTM E1155M standard, starting from data provided by a contact profilometer. If the target values are not reached, alternative solutions must be found to avoid demolition of the slabs or the payment of penalties by the builder, where required by the contract. There are two main methods for increasing flatness and levelness while keeping the other functional surface properties at the expected levels: surface grinding and overlaying with self-levelling, high-resistance resins. A case study in which the two alternative methods are applied to improve the flatness and levelness of a surface is presented.
The measurements made before and after the treatments showed that both solutions are able to ensure, within certain limits, fulfilment of the requirements, and consequently both can be used for the proposed aims.
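As a rough illustration of the kind of computation involved, the sketch below derives the curvature statistics that the ASTM E1155M procedure turns into an FF flatness number. This is our own simplified version: the 300 mm reading interval follows the standard, but the function name is hypothetical and the standard's final scaling formula (which maps these two statistics to FF) is deliberately omitted.

```python
from statistics import mean, pstdev

def curvature_stats(elevations_mm):
    """Curvature values q_i = z_i - 2*z_{i+1} + z_{i+2} from elevation
    readings z taken at the 300 mm spacing used in FF flatness surveys.
    Returns (mean_q, sd_q); ASTM E1155M combines these two statistics
    into the FF number via its own defined scaling."""
    q = [elevations_mm[i] - 2 * elevations_mm[i + 1] + elevations_mm[i + 2]
         for i in range(len(elevations_mm) - 2)]
    return mean(q), pstdev(q)
```

A perfectly planar but sloped floor gives all-zero curvatures, which is why FF measures flatness independently of levelness (FL is computed separately, from slope statistics).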
Cohesive properties of noble metals by van der Waals-corrected Density Functional Theory
The cohesive energy, equilibrium lattice constant, and bulk modulus of noble
metals are computed by different van der Waals-corrected Density Functional
Theory methods, including vdW-DF, vdW-DF2, vdW-DF-cx, rVV10 and PBE-D. Two
specifically-designed methods are also developed in order to effectively
include dynamical screening effects: the DFT/vdW-WF2p method, based on the
generation of Maximally Localized Wannier Functions, and the RPAp scheme (in
two variants), based on a single-oscillator model of the localized electron
response. Comparison with results obtained without explicit inclusion of van
der Waals effects, such as with the LDA, PBE, PBEsol, or the hybrid PBE0
functional, elucidates the importance of a suitable description of screened van
der Waals interactions even in the case of strong metal bonding. Many-body
effects are also quantitatively evaluated within the RPAp approach.
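For readers less familiar with how the quoted equilibrium properties follow from total-energy calculations, here is a minimal generic sketch (not the paper's actual fitting procedure, which would typically use a Birch-Murnaghan equation of state over many points): a parabola through three energy-volume points gives the equilibrium volume V0 and the bulk modulus B = V0 d²E/dV².

```python
def parabolic_eos(volumes, energies):
    """Fit E(V) = a*V^2 + b*V + c exactly through three (volume, energy)
    points and return (V0, B): equilibrium volume V0 = -b/(2a) and bulk
    modulus B = V0 * E''(V0) = 2*a*V0, in whatever consistent units the
    inputs use.  A simplified stand-in for a Birch-Murnaghan fit."""
    (v1, v2, v3), (e1, e2, e3) = volumes, energies
    # Divided differences give the quadratic coefficients directly.
    a = ((e3 - e1) / (v3 - v1) - (e2 - e1) / (v2 - v1)) / (v3 - v2)
    b = (e2 - e1) / (v2 - v1) - a * (v1 + v2)
    v0 = -b / (2 * a)
    return v0, 2 * a * v0
```

The equilibrium lattice constant then follows from V0 and the crystal structure (for fcc noble metals, a = (4 V0)**(1/3) with V0 per atom).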
Improved modelling of liquid GeSe: the impact of the exchange-correlation functional
The structural properties of liquid GeSe are studied by using
first-principles molecular dynamics in conjunction with the Becke, Lee, Yang and
Parr (BLYP) generalized gradient approximation for the exchange and correlation
energy. The results on partial pair correlation functions, coordination
numbers, bond angle distributions and partial structure factors are compared
with available experimental data and with previous first-principles molecular
dynamics results obtained within the Perdew and Wang (PW) generalized gradient
approximation for the exchange and correlation energy. We found that the BLYP
approach substantially improves upon the PW one in the case of the short-range
properties. In particular, the GeGe pair correlation function takes a more
structured profile that includes a marked first peak due to homopolar bonds, a
first maximum exhibiting a clear shoulder and a deep minimum, all these
features being absent in the previous PW results. Overall, the amount of
tetrahedral order is significantly increased, in spite of a larger number of
GeGe homopolar connections. Due to the smaller number of miscoordinations,
diffusion coefficients obtained by the present BLYP calculation are smaller by
at least one order of magnitude than in the PW case.
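The partial pair correlation functions discussed above are computed from atomic configurations; a minimal single-snapshot, single-species sketch (our own simplified version — production MD analysis averages over many configurations and treats the Ge-Ge, Ge-Se and Se-Se partials separately) is:

```python
import math

def pair_correlation(positions, box, dr, r_max):
    """Radial distribution function g(r) for particles in a cubic box of
    side `box`, using minimum-image periodic boundaries.  Returns one
    g value per bin of width `dr` up to `r_max` (< box/2)."""
    n = len(positions)
    nbins = int(r_max / dr)
    hist = [0] * nbins
    for i in range(n):
        for j in range(i + 1, n):
            d2 = 0.0
            for k in range(3):
                d = positions[i][k] - positions[j][k]
                d -= box * round(d / box)       # minimum-image convention
                d2 += d * d
            r = math.sqrt(d2)
            if r < r_max:
                hist[int(r / dr)] += 2          # count both i-j and j-i
    rho = n / box ** 3                          # number density
    g = []
    for b in range(nbins):
        # Normalize by the ideal-gas count in each spherical shell.
        shell = 4.0 / 3.0 * math.pi * (((b + 1) * dr) ** 3 - (b * dr) ** 3)
        g.append(hist[b] / (n * rho * shell))
    return g
```

Coordination numbers then follow by integrating rho * g(r) * 4*pi*r^2 up to the first minimum of g(r).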
Tailoring Dielectric Properties of Multilayer Composites Using Spark Plasma Sintering
A straightforward and simple way to produce well-densified ferroelectric ceramic composites, with full control of both architecture and properties, using spark plasma sintering (SPS) is proposed. The main benefit of SPS is indeed high densification at relatively low temperatures and short treatment times, thus limiting interdiffusion in multimaterials. A ferroelectric/dielectric (BST64/MgO/BST64) multilayer ceramic densified to 97% was obtained, with an unmodified Curie temperature, a stack dielectric constant reaching 600, and dielectric losses dropping to 0.5% at room temperature. This result establishes SPS as a relevant tool for the design of functional materials with tailored properties.
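The stack dielectric constant quoted above can be understood with the textbook series-capacitor model for layers measured through their thickness. The sketch below is a generic illustration only: the thickness and permittivity numbers in the example are made-up placeholders, not the BST64/MgO values of the paper, and the model ignores interfacial and interdiffusion effects.

```python
def stack_permittivity(layers):
    """Effective relative permittivity of a dielectric multilayer measured
    across the layers (capacitors in series): eps_eff = sum(t) / sum(t/eps).
    `layers` is a list of (thickness, relative_permittivity) pairs."""
    total = sum(t for t, _ in layers)
    return total / sum(t / eps for t, eps in layers)

# Illustrative numbers only: two layers of equal thickness with relative
# permittivities 2 and 6 give an effective value of 3.
```

The harmonic-mean form explains why a thin low-permittivity layer (here MgO) dominates the stack response and pulls the losses down.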
Applicability of semi-supervised learning assumptions for gene ontology terms prediction
Gene Ontology (GO) is one of the most important resources in bioinformatics, aiming to provide a unified framework for the biological annotation of genes and proteins across all species. Predicting GO terms is an essential task for bioinformatics, but the number of available labelled proteins is in many cases insufficient for training reliable machine learning classifiers. Semi-supervised learning methods arise as a powerful solution that exploits the information contained in unlabelled data in order to improve the estimations of traditional supervised approaches. However, semi-supervised learning methods have to make strong assumptions about the nature of the training data, and thus the performance of the predictor is highly dependent on these assumptions. This paper presents an analysis of the applicability of semi-supervised learning assumptions to the specific task of GO term prediction, focused on providing elements of judgment that allow choosing the most suitable tools for specific GO terms. The results show that semi-supervised approaches significantly outperform the traditional supervised methods, and that the highest performances are reached when applying the cluster assumption. Besides, it is experimentally demonstrated that the cluster and manifold assumptions are complementary to each other, and an analysis of which GO terms are more likely to be correctly predicted with each assumption is provided.
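As a concrete illustration of the cluster assumption (our own toy sketch, not the paper's method or data), the following self-training loop labels an unlabelled point only when it lies within a distance threshold of an already-labelled point, so labels spread through dense clusters but never jump the gap between them:

```python
import math

def self_train_1nn(points, labels, threshold):
    """Crude nearest-neighbour self-training: repeatedly give each
    unlabelled point the label of its nearest labelled neighbour, but only
    when that neighbour is closer than `threshold`.  This encodes the
    cluster assumption: points in the same dense region share a label.
    `labels[i]` is None when point i starts unlabelled."""
    labels = list(labels)
    changed = True
    while changed:
        changed = False
        for i, p in enumerate(points):
            if labels[i] is not None:
                continue
            best, best_d = None, threshold
            for j, q in enumerate(points):
                if labels[j] is None:
                    continue
                d = math.dist(p, q)
                if d < best_d:
                    best, best_d = labels[j], d
            if best is not None:
                labels[i] = best
                changed = True
    return labels
```

With two well-separated clusters seeded by one label each, every point ends up with its cluster's label, while an isolated point farther than the threshold from both stays unlabelled.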
Integration of end-of-life options as a design criterion in methods and tools for ecodesign
Ecodesigning a product consists (among other things) in assessing what its environmental impacts will be throughout its life (that is to say, from the design phase to the end of life) in order to limit them. Tools and methods exist to (eco)design a product, as well as methods that assess its environmental impacts (most often a posteriori). But it is now well accepted that it is the early design decisions that have the greatest consequences for the product's end-of-life options and their impacts. Thus, the present work aims at analysing traditional design tools so as to integrate end-of-life possibilities in the form of recommendations for the design step. This proposal is illustrated by means of a wind turbine design.
Support vector machine for functional data classification
In many applications, input data are sampled functions taking their values in
infinite-dimensional spaces rather than standard vectors. This fact has complex
consequences for data analysis algorithms that motivate modifying them.
In fact, most of the traditional data analysis tools for regression,
classification and clustering have been adapted to functional inputs under the
general name of Functional Data Analysis (FDA). In this paper, we investigate
the use of Support Vector Machines (SVMs) for functional data analysis and we
focus on the problem of curve discrimination. SVMs are large-margin classifiers
based on implicit non-linear mappings of the considered data into
high-dimensional spaces by means of kernels. We show how to define simple
kernels that take into account the functional nature of the data and lead to
consistent classification. Experiments conducted on real-world data emphasize
the benefit of taking into account some functional aspects of the problems.
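One simple kernel in the spirit described above (our own minimal sketch; the paper's kernels and consistency conditions are more elaborate) is a Gaussian kernel built on the L2 distance between curves, approximated from their sampled values by the trapezoidal rule:

```python
import math

def l2_distance(f, g, grid):
    """Approximate L2 distance between two curves sampled at the points of
    `grid` (trapezoidal rule), the ingredient of a functional kernel."""
    sq = [(a - b) ** 2 for a, b in zip(f, g)]
    integral = sum((grid[i + 1] - grid[i]) * (sq[i] + sq[i + 1]) / 2.0
                   for i in range(len(grid) - 1))
    return math.sqrt(integral)

def functional_rbf_kernel(f, g, grid, gamma=1.0):
    """K(f, g) = exp(-gamma * ||f - g||_L2^2): a Gaussian kernel that
    treats its inputs as functions rather than plain vectors."""
    return math.exp(-gamma * l2_distance(f, g, grid) ** 2)
```

Because K is a positive-definite kernel on the sampled curves, its Gram matrix can be passed to any kernel-SVM implementation that accepts precomputed kernels.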
Modifications to the Aesop's Fable paradigm change New Caledonian crow performances
While humans are able to understand much about causality, it is unclear to what extent non-human animals can do the same. The Aesop's Fable paradigm requires an animal to drop stones into a water-filled tube to bring a floating food reward within reach. Rook, Eurasian jay, and New Caledonian crow performances are similar to those of children under seven years of age when solving this task. However, we know very little about the cognition underpinning these birds' performances. Here, we address several limitations of previous Aesop's Fable studies to gain insight into the causal cognition of New Caledonian crows. Our results provide the first evidence that any non-human animal can solve the U-tube task and can discriminate between water-filled tubes of different volumes. However, our results do not provide support for the hypothesis that these crows can infer the presence of a hidden causal mechanism. They also call into question previous object-discrimination performances. The methodologies outlined here should allow for more powerful comparisons between humans and other animal species and thus help us to determine which aspects of causal cognition are distinct to humans.