Analysis of Potential Value Chains for Scaling up Climate-Smart Agriculture in West Africa
Despite the development of several CSA options and their positive gains, their wide-scale adoption remains a challenge. Integrating value chain analysis into the Climate-Smart Village (CSV) - Agricultural Research for Development (AR4D) approach is emerging as an effective approach for scaling up CSA
Taking advantage of hybrid systems for sparse direct solvers via task-based runtimes
The ongoing hardware evolution exhibits an escalation in the number, as well
as in the heterogeneity, of computing resources. The pressure to maintain
reasonable levels of performance and portability forces application developers
to leave the traditional programming paradigms and explore alternative
solutions. PaStiX is a parallel sparse direct solver, based on a dynamic
scheduler for modern hierarchical manycore architectures. In this paper, we
study the benefits and limits of replacing the highly specialized internal
scheduler of the PaStiX solver with two generic runtime systems: PaRSEC and
StarPU. The task graph of the factorization step is made available to the two
runtimes, providing them the opportunity to process and optimize its traversal
in order to maximize the algorithm efficiency for the targeted hardware
platform. A comparative study of the performance of the PaStiX solver on top of
its native internal scheduler, PaRSEC, and StarPU frameworks, on different
execution environments, is performed. The analysis highlights that these
generic task-based runtimes achieve comparable results to the
application-optimized embedded scheduler on homogeneous platforms. Furthermore,
they are able to significantly speed up the solver on heterogeneous
environments by taking advantage of the accelerators while hiding the
complexity of their efficient manipulation from the programmer. (Comment: Heterogeneity in Computing Workshop, 2014)
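The core service such runtimes provide, executing the factorization's task graph in dependency order, can be illustrated with a minimal sketch. This is plain Python, not the PaRSEC or StarPU API; the task names are hypothetical panel-factorization/update tasks, and a real runtime would additionally dispatch each ready task to a CPU or GPU worker.

```python
from collections import deque

def run_task_graph(tasks, deps):
    """Execute tasks in an order that respects dependencies.

    tasks: dict mapping task name -> callable
    deps:  dict mapping task name -> set of prerequisite task names
    Returns one valid topological execution order.
    """
    remaining = {t: set(deps.get(t, ())) for t in tasks}
    ready = deque(t for t, d in remaining.items() if not d)
    order = []
    while ready:
        t = ready.popleft()
        tasks[t]()  # a runtime would dispatch this to a CPU or GPU worker
        order.append(t)
        # Release successors whose last prerequisite just completed.
        for u, d in remaining.items():
            if t in d:
                d.remove(t)
                if not d:
                    ready.append(u)
    if len(order) != len(tasks):
        raise ValueError("cycle in task graph")
    return order
```

For a two-supernode factorization, a "factorize panel 2" task would depend on the update produced by panel 1, and the scheduler runs them in that order automatically.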
Metal enrichment in a semi-analytical model, fundamental scaling relations, and the case of Milky Way galaxies
Gas flows play a fundamental role in galaxy formation and evolution,
providing the fuel for the star formation process. These mechanisms leave an
imprint in the amount of heavy elements. Thus, the analysis of this metallicity
signature provides additional constraint on the galaxy formation scenario. We
aim to discriminate between four different galaxy formation models based on two
accretion scenarios and two different star formation recipes. We address the
impact of a bimodal accretion scenario and a strongly regulated star formation
recipe. We present a new extension of the eGalICS model, which allows us to
track the metal enrichment process. Our new chemodynamical model is applicable
for situations ranging from metal-free primordial accretion to very enriched
interstellar gas contents. We use this new tool to predict the metallicity
evolution of both the stellar populations and gas phase. We also address the
evolution of the gas metallicity with the star formation rate (SFR). We then
focus on a sub-sample of Milky Way-like galaxies. We compare both the cosmic
stellar mass assembly and the metal enrichment process of such galaxies with
observations and detailed chemical evolution models. Our models, based on a
strong star formation regulation, allow us to reproduce well the stellar mass
to gas-phase metallicity relation observed in the local universe. However, we
observe a systematic shift towards high masses. Our $M_\star$-$Z_\mathrm{g}$-SFR relation is
in good agreement with recent measurements: our best model predicts a clear
dependence with the SFR. Both SFR and metal enrichment histories of our Milky
Way-like galaxies are consistent with observational measurements and detailed
chemical evolution models. We finally show that Milky Way progenitors start
their evolution below the observed main sequence and progressively reach this
observed relation at z = 0. (Comment: 22 pages, 11 figures)
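As a point of reference for how gas consumption drives gas-phase enrichment, the textbook closed-box model gives $Z = y \ln(1/\mu)$, with $\mu$ the remaining gas fraction and $y$ the stellar yield. This is not the eGalICS chemodynamical model, only the simplest limiting case it generalizes (a full model adds inflows, outflows, and delayed enrichment), and the yield value below is an illustrative assumption.

```python
import math

def closed_box_metallicity(gas_fraction, yield_=0.02):
    """Gas-phase metal mass fraction in the closed-box model:
    Z = y * ln(1 / mu), with mu the remaining gas fraction and y the
    assumed net stellar yield (0.02 here is illustrative)."""
    if not 0.0 < gas_fraction <= 1.0:
        raise ValueError("gas fraction must lie in (0, 1]")
    return yield_ * math.log(1.0 / gas_fraction)
```

The model captures the qualitative trend used above: galaxies that have turned more of their gas into stars (lower gas fraction) host a more metal-rich interstellar medium.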
Ab-initio Modeling of CBRAM Cells: from Ballistic Transport Properties to Electro-Thermal Effects
We present atomistic simulations of conductive bridging random access memory
(CBRAM) cells from first-principles combining density-functional theory and the
Non-equilibrium Green's Function formalism. Realistic device structures with an
atomic-scale filament connecting two metallic contacts have been constructed.
Their transport properties have been studied in the ballistic limit and in the
presence of electron-phonon scattering, showing good agreement with
experimental data. It has been found that the relocation of a few atoms is
sufficient to change the resistance of the CBRAM by 6 orders of magnitude, that
the electron trajectories strongly depend on the filament morphology, and that
self-heating does not affect the device performance at currents below 1 A. (Comment: 6 figures, conference)
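The six-orders-of-magnitude resistance change maps directly onto the ballistic (Landauer) picture: in the simplest single-mode form, $R = 1/(G_0 M T)$ with $G_0 = 2e^2/h$ the conductance quantum, so moving a few filament atoms that suppress the transmission $T$ by $10^{-6}$ changes $R$ by the same factor. A sketch (the transmission values used below are illustrative, not from the paper):

```python
G0 = 7.748091729e-5  # conductance quantum 2e^2/h, in siemens

def landauer_resistance(transmission, modes=1):
    """Two-terminal ballistic resistance from the Landauer formula,
    R = 1 / (G0 * M * T), for M conducting modes of transmission T."""
    if not 0.0 < transmission <= 1.0:
        raise ValueError("transmission must lie in (0, 1]")
    return 1.0 / (G0 * modes * transmission)

# Illustrative ON/OFF states of a filament: T = 1 vs. T = 1e-6.
r_ratio = landauer_resistance(1e-6) / landauer_resistance(1.0)
```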
The impact of organizational culture on the use of research evidence by practitioners and managers in social intervention
Introduction: Several determinants (attributes of the knowledge itself, and individual, environmental, and organizational factors) explain the limited use of research evidence in the practice of social-service practitioners in the health and social services network. This study focuses specifically on the impact of organizational culture (OC) on the use of research evidence (RE). It aims to compare the views of managers and practitioners regarding (1) organizational culture, (2) organizational capacity for RE use, (3) RE use itself, and (4) to establish the degree of correlation between OC types and RE use. Methodology: This study is based on a secondary analysis of data. The primary data come from a study evaluating organizational culture and knowledge transfer and use, conducted in a youth centre. The research design is both quantitative and exploratory, using a volunteer sample of 188 youth-centre employees. Results: Overall, our analyses indicate that a hierarchical culture characterizes how the youth centre operates. Moreover, managers perceive the organizational capacity for RE use more positively, and use research evidence more, than practitioners do. Finally, the analysis shows that a hierarchical culture is negatively associated with RE use, whereas developmental and group cultures are positively associated with it. Conclusion: Organizations in the social services sector that wish to foster RE use should therefore consider a change or adjustment of organizational culture that builds on the characteristics of group and developmental cultures.
However, further studies are needed to validate our results and to deepen our understanding of the mechanisms underlying the relationship between OC and RE use in the social services field
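The correlation analysis between organizational-culture types and evidence use described above can be sketched as a plain Pearson correlation over paired scores. This is an illustrative computation with made-up Likert-style scores, not the study's data or its exact statistical procedure.

```python
def pearson(xs, ys):
    """Pearson correlation between two equal-length score lists,
    e.g. per-respondent culture-type scores vs. evidence-use scores."""
    n = len(xs)
    if n != len(ys) or n < 2:
        raise ValueError("need two equal-length samples of size >= 2")
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5
```

A negative coefficient between hierarchical-culture scores and evidence-use scores, alongside positive coefficients for group and developmental cultures, is the pattern the study reports.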
Pre-service teachers' views on the importance of ICT (Internet) skills for young people, and on who is responsible for developing them
In the 21st century, ICT skills are important for the integration of individuals into society and for the competitiveness of nations. Several nations have accordingly adjusted their curricula, assigning this responsibility to schools. But what do pre-service teachers think? Do they believe it is up to schools to support the development of these skills? To find out, we asked 328 student teachers from Switzerland, France, and Quebec to rate, on a Likert scale, the importance of 21 ICT skills/knowledge items and to tell us who, in their view, should be responsible for overseeing the development of each
Engineering the Frequency Spectrum of Bright Squeezed Vacuum via Group Velocity Dispersion in an SU(1,1) Interferometer
Bright squeezed vacuum, a promising tool for quantum information, can be
generated by high-gain parametric down-conversion. However, its frequency and
angular spectra are typically quite broad, which is undesirable for
applications requiring single-mode radiation. We tailor the frequency spectrum
of high-gain parametric down-conversion using an SU(1,1) interferometer
consisting of two nonlinear crystals with a dispersive medium separating them.
The dispersive medium allows us to select a narrow band of the frequency
spectrum to be exponentially amplified by high-gain parametric amplification.
The frequency spectrum is thereby narrowed from (56.5 ± 0.1) THz to
(1.22 ± 0.02) THz and, in doing so, the number of frequency modes is reduced
from approximately 50 to 1.82 ± 0.02. Moreover, this method provides control and
flexibility over the spectrum of the generated light through the timing of the
pump. (Comment: 6 pages, 5 figures)
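The quoted mode numbers can be read as an effective (Schmidt) mode count, the standard participation-ratio measure $K = (\sum_k w_k)^2 / \sum_k w_k^2$ over the mode weights $w_k$: a single populated mode gives $K = 1$, and $N$ equally populated modes give $K = N$. A sketch with illustrative weights (not the measured spectrum):

```python
def effective_mode_number(weights):
    """Effective (Schmidt) number of modes, K = (sum w)^2 / sum w^2,
    for non-negative mode weights w_k."""
    s1 = sum(weights)
    s2 = sum(w * w for w in weights)
    if s2 == 0:
        raise ValueError("at least one nonzero weight required")
    return s1 * s1 / s2
```

Exponential amplification of a narrow band concentrates the weight into one dominant mode, which is how the interferometer pushes K from roughly 50 towards 1.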
A Lower Bound and a Near-Optimal Algorithm for Bilevel Empirical Risk Minimization
Bilevel optimization problems, which are problems where two optimization
problems are nested, have more and more applications in machine learning. In
many practical cases, the upper and the lower objectives correspond to
empirical risk minimization problems and therefore have a sum structure. In
this context, we propose a bilevel extension of the celebrated SARAH algorithm.
We demonstrate that the number of gradient computations the algorithm requires
to achieve $\varepsilon$-stationarity, as a function of the total number of
samples, improves over all previous bilevel algorithms. Moreover, we provide a lower
bound on the number of oracle calls required to get an approximate stationary
point of the objective function of the bilevel problem. This lower bound is
attained by our algorithm, which is therefore optimal in terms of sample
complexity.
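For reference, the single-level SARAH estimator that the paper extends to the bilevel setting maintains a recursive variance-reduced gradient, v_t = grad_i(x_t) - grad_i(x_{t-1}) + v_{t-1}, reset to the full gradient at the start of each epoch. The sketch below is that single-level building block on a scalar problem, with illustrative step sizes and epoch lengths; it is not the paper's bilevel algorithm.

```python
import random

def sarah(grad_i, n, x0, lr=0.1, epochs=3, inner=None, seed=0):
    """SARAH on f(x) = (1/n) * sum_i f_i(x) for scalar x.

    grad_i(i, x) returns the gradient of the i-th term at x.
    Each epoch starts from the exact full gradient, then applies
    `inner` cheap recursive updates using single sampled terms.
    """
    rng = random.Random(seed)
    inner = inner or n
    x = x0
    for _ in range(epochs):
        v = sum(grad_i(i, x) for i in range(n)) / n  # full-gradient reset
        x_prev, x = x, x - lr * v
        for _ in range(inner):
            i = rng.randrange(n)
            v = grad_i(i, x) - grad_i(i, x_prev) + v  # recursive estimator
            x_prev, x = x, x - lr * v
    return x
```

On a toy finite sum of quadratics f_i(x) = (x - a_i)^2 / 2, the estimator tracks the full gradient exactly and the iterates converge to the mean of the a_i.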