1,659 research outputs found

    Final Report, Center for Programming Models for Scalable Parallel Computing: Co-Array Fortran, Grant Number DE-FC02-01ER25505


    A Cross-Platform Infrastructure for Scalable Runtime Application Performance Analysis


    Secure Integration of Desktop Grids and Compute Clusters Based on Virtualization and Meta-Scheduling

    Reducing the cost of business or scientific computations is a commonly expressed goal in today's companies. Using the available computers of local employees or outsourcing such computations are two obvious ways to save money on additional hardware. Both possibilities exhibit security-related disadvantages, since the deployed software and data can be copied or tampered with if appropriate countermeasures are not taken. In this paper, an approach is presented to let local desktop machines and remote cluster resources be securely combined into a single Grid environment. Solutions to several problems in the areas of secure virtual networks, meta-scheduling, and accessing cluster schedulers from desktop Grids are proposed.

    Machine translation evaluation resources and methods: a survey

    We introduce a survey of Machine Translation (MT) evaluation covering both manual and automatic evaluation methods. The traditional human evaluation criteria mainly include intelligibility, fidelity, fluency, adequacy, comprehension, and informativeness. Advanced human assessments include task-oriented measures, post-editing, segment ranking, and extended criteria. We classify the automatic evaluation methods into two categories: the lexical similarity scenario and the application of linguistic features. The lexical similarity methods comprise edit distance, precision, recall, F-measure, and word order. The linguistic features can be divided into syntactic and semantic features. The syntactic features include part-of-speech tags, phrase types, and sentence structures, and the semantic features include named entities, synonyms, textual entailment, paraphrase, semantic roles, and language models. Deep learning models for evaluation have been proposed only very recently. Subsequently, we also introduce the meta-evaluation of MT evaluation methods, including different correlation scores, and the recent quality estimation (QE) tasks for MT. This paper differs from existing works (GALEprogram2009; EuroMatrixProject2007) in several aspects: it introduces recent developments in MT evaluation measures, the different classifications from manual to automatic evaluation measures, the recent QE tasks of MT, and a concise construction of the content.
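    The lexical-similarity metrics named above (precision, recall, F-measure over token overlap) can be sketched in a few lines. This is an illustrative minimal implementation, not any specific metric from the survey; the sentence pair is a made-up example.

    ```python
    from collections import Counter

    def precision_recall_f1(candidate: str, reference: str):
        """Clipped token-overlap precision, recall and F-measure,
        in the spirit of lexical-similarity MT evaluation."""
        cand = candidate.split()
        ref = reference.split()
        # Counter intersection clips each token's count to the reference count,
        # so repeating a word in the candidate cannot inflate the score.
        overlap = sum((Counter(cand) & Counter(ref)).values())
        p = overlap / len(cand) if cand else 0.0
        r = overlap / len(ref) if ref else 0.0
        f1 = 2 * p * r / (p + r) if p + r else 0.0
        return p, r, f1

    p, r, f1 = precision_recall_f1("the cat sat on the mat",
                                   "the cat is on the mat")
    # 5 of 6 candidate tokens match the reference, so p = r = f1 = 5/6
    ```

    Real metrics in this family (e.g. BLEU) extend the same clipped-overlap idea to n-grams and add a brevity penalty.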

    Contributions à la réplication de données dans les systèmes distribués à grande échelle

    Data replication is a key mechanism for building a reliable and efficient data management system. By keeping several replicas of each piece of data, it is possible to improve durability; furthermore, well-placed copies reduce data access time. However, having multiple copies of a single piece of data creates consistency problems when the data is updated. Over the last years, I have made contributions related to these three aspects: data durability, data access performance, and data consistency. RelaxDHT and SPLAD enhance data durability by placing data copies smartly. Caju, AREN, and POPS reduce access time by improving data locality and by taking popularity into account. To enhance data lookup performance, DONUT creates efficient shortcuts that take data distribution into account. Finally, in the context of replicated databases, Gargamel parallelizes only independent transactions, improving database performance and avoiding transaction aborts caused by conflicts. My research has been carried out in collaboration with eight PhD students, four of whom have defended. In my future work, I plan to extend these contributions by (i) designing a storage system tailored for MMOGs, a particularly demanding class of application, and (ii) designing a data management system able to re-distribute data automatically in order to scale the number of servers up and down according to the changing workload, leading to greener data management.
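    The durability argument — more replicas means less chance of losing an object — can be made concrete with a toy independent-failure model. This sketch is an illustration only; it is not a model used by RelaxDHT, SPLAD, or any other system named in the abstract, and the 1% failure probability is an assumed figure.

    ```python
    def loss_probability(node_failure_prob: float, replicas: int) -> float:
        """Probability that all replicas of one object are lost,
        assuming each hosting node fails independently."""
        return node_failure_prob ** replicas

    # With a 1% chance of losing any single node, one copy is lost 1% of
    # the time, while three copies are all lost only about once in a million.
    single = loss_probability(0.01, 1)
    triple = loss_probability(0.01, 3)
    ```

    Real systems complicate this picture with correlated failures and repair times, which is precisely why replica *placement* (the focus of RelaxDHT and SPLAD) matters beyond the raw replica count.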

    An Intentionalist Approach to the Distinction between Monetary and Economic Policies

    This thesis investigates the necessity of drawing a legal demarcation line between monetary policy and economic policy. While a distinction between monetary and economic policies makes no sense from an economic perspective, it has nonetheless been challenged on three occasions before the CJEU. Those judgments led the literature to consider that any legal distinction would be either “doomed to failure” or “arbitrary”. However, the recent judgment issued by the FCC, declaring the PSPP and the Weiss and others judgment ultra vires, put this legal issue at the heart of the European legal order. Consequently, this thesis aims to answer the following research question: in light of the intent of the authors of the Treaties, how should monetary and economic policies be legally distinguished so as to respect the principle of conferral? Like the CJEU in Weiss and others, this thesis employs an intentionalist methodology to answer this question. More specifically, it examines, in chronological order, relevant documents researched in the Historical Archives of the European Union to appreciate the intent of the authors of the Treaties. Based on a myriad of historical sources, this thesis finds that the intent of the authors of the Treaties is much more complex and paradoxical than the CJEU has claimed. Indeed, after examining the conceptualization of the single monetary policy, this historical analysis is confronted with the judgments issued by the CJEU in Pringle, Gauweiler, and Weiss and others. In essence, this thesis finds that the CJEU misperceived the intent of the authors of the Treaties. Additionally, by carefully examining the written observations submitted by the parties to the proceedings, this thesis also sheds new light on the reasoning of the Court. After considering the judgment issued by the FCC, this thesis examines the necessity of drawing a legal demarcation line that would not be arbitrary. In that regard, and similarly to Council Regulation (EC) 3603/93, this thesis concludes by proposing to specify the intent of the authors of the Treaties by means of an act of secondary law.

    A National Collaboratory to Advance the Science of High Temperature Plasma Physics for Magnetic Fusion


    The Treaty of Lisbon: Implementing the Institutional Innovations. CEPS Special Reports, November 2007

    After a long period of internal introspection and deadlock over the Constitutional Treaty, the EU can now see some light at the end of the tunnel. If successfully ratified, the new European Treaty agreed by the Heads of State and Government in Lisbon may provide the appropriate institutional tools for the EU to function with 27 member states. However, the success of institutional innovations depends not only on legal provisions, but also on the way in which those provisions are implemented. Indeed, even a cursory examination indicates that implementing the new proposals is unlikely to be easy, and in some cases could be a source of serious difficulties in the future. In the absence of serious analysis of this latter question, three Brussels-based think-tanks have joined forces in a collaborative effort to fill the gap. Our aim is to highlight potential problems and, where possible, to suggest ways to avoid or attenuate their negative effects. The analysis has focused on seven main institutional and policy domains: the European Parliament, the European Commission, the Presidency of the Council, qualified majority voting in the Council, the role of national Parliaments, enhanced cooperation, and foreign policy. These issues have been intensively debated in working groups composed of researchers, external experts, and practitioners in the field. This report reflects the substance of that collective effort.

    International Union of Theoretical and Applied Mechanics: report 2003
