
    Studies on automatic parallelization for heterogeneous and homogeneous multicore processors

    Get PDF
    Degree system: New ; Report number: Kou 3537 ; Type of degree: Doctor of Engineering ; Date conferred: 2012/2/25 ; Waseda University diploma number: Shin 587

    Accelerating Dynamical Density Response Code on Summit and Its Application for Computing the Density Response Function of Vanadium Sesquioxide

    Get PDF
    This thesis details the process of porting the Eguiluz group dynamical density response computational platform to the hybrid CPU+GPU environment at the Summit supercomputer at Oak Ridge National Laboratory (ORNL) Leadership Computing Center. The baseline CPU-only version is a Gordon Bell-winning platform within the formally exact time-dependent density functional theory (TD-DFT) framework using the linearized augmented plane wave (LAPW) basis set. The code is accelerated using a combination of the OpenACC programming model and GPU libraries, namely the Matrix Algebra for GPU and Multicore Architectures (MAGMA) library, as well as by exploiting the sparsity pattern of the matrices involved in the matrix-matrix multiplication. Benchmarks show a 12.3x speedup compared to the CPU-only version. This performance boost should accelerate discovery in materials and condensed matter physics through computational means. After the hybrid CPU+GPU code has been sufficiently optimized, it is used to study the dynamical density response function of vanadium sesquioxide, and the results are compared with spectroscopic data from non-resonant inelastic X-ray scattering (NIXS) experiments.
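    The platform itself is accelerated with OpenACC and MAGMA and is not reproduced here. As a rough illustration of one idea the abstract mentions (exploiting the sparsity pattern of a matrix operand so that zero entries contribute no work to a matrix-matrix product), the following Python sketch contrasts a dense product with a CSR-based one; the matrix size and density are invented for the example.

        import numpy as np
        from scipy import sparse

        # Hypothetical size and density; the real LAPW matrices are application-dependent.
        n, density = 2000, 0.02

        A = sparse.random(n, n, density=density, format="csr", random_state=0)  # mostly zero
        B = np.random.default_rng(0).standard_normal((n, n))                    # dense operand

        C_dense = A.toarray() @ B   # dense path: all n*n entries, zeros included, do work
        C_sparse = A @ B            # sparse path: only the ~2% stored nonzeros do work

        assert np.allclose(C_dense, C_sparse)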

    A Survey of AI Music Generation Tools and Models

    Full text link
    In this work, we provide a comprehensive survey of AI music generation tools, including both research projects and commercialized applications. To conduct our analysis, we classified music generation approaches into three categories: parameter-based, text-based, and visual-based. Our survey highlights the diverse possibilities and functional features of these tools, which cater to a wide range of users, from regular listeners to professional musicians. We observed that each tool has its own set of advantages and limitations. As a result, we have compiled a comprehensive list of the factors that should be considered during the tool selection process. Moreover, our survey offers critical insights into the underlying mechanisms and challenges of AI music generation.

    THE USE OF RECOMMENDER SYSTEMS IN WEB APPLICATIONS – THE TROI CASE

    Get PDF
    In the digital age, neglecting digital marketing, surveys, reviews, and online user behavior is a key reason even strong businesses fail; such systems should therefore be underpinned by artificial intelligence techniques. In this direction, the use of data mining to recommend relevant items has become a state-of-the-art technique that increases user satisfaction as well as business revenue, alongside other information-gathering approaches that help our systems think and act more like humans. This thesis elaborates on Recommender Systems: how people interact, and how to accurately calculate and identify what people like or dislike based on their previous online behavior. The thesis also covers the methodologies Recommender Systems use and how mathematical formulations help them quantify user behavior and similarity. Filtering is central to a Recommender System: if similar users like the same product or item, what is the probability that a neighboring user will like it as well? This leads to collaborative filtering, neighborhood filtering, and hybrid recommender systems; with these algorithms, a Recommender System can predict whether a particular user would prefer an item, based on the user's profile and activities. The use of Recommender Systems is beneficial to both service providers and users. The thesis also covers the strengths and weaknesses of Recommender Systems and how involving an Ontology can improve them; ontology-based methods can be used to reduce problems that content-based recommender systems are known to suffer from. Given Kosovo's GDP and the job prospects of its young people, improvement is desirable, as demand is greater than supply. I therefore set out to build an intelligent system that makes it easier for Kosovars to find a job that suits their profile, skills, knowledge, character, and location. That system is the TROI search engine, which indexes and merges all locally operating job-seeking websites into one platform with intelligent features. The thesis presents the design, implementation, testing, and evaluation of the TROI search engine. Testing was carried out through user experiments in a running environment of the TROI search engine. Results show that the functionality of the recommender system is satisfactory and helpful.
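    The thesis text above describes neighborhood-based collaborative filtering in prose only; the short Python sketch below (not code from the thesis, and using a made-up rating matrix) shows the core calculation it refers to: measure user-to-user similarity, then predict an unknown rating from the most similar neighbors who rated the item.

        import numpy as np

        def cosine_similarity(a, b):
            """Cosine similarity between two rating vectors (0 marks an unrated item)."""
            denom = np.linalg.norm(a) * np.linalg.norm(b)
            return float(a @ b / denom) if denom else 0.0

        def predict_rating(ratings, user, item, k=2):
            """Predict ratings[user, item] from the k most similar users who rated the item."""
            sims = np.array([cosine_similarity(ratings[user], ratings[u]) if u != user else -1.0
                             for u in range(ratings.shape[0])])
            rated = np.where(ratings[:, item] > 0)[0]                   # users who rated the item
            neighbors = sorted(rated, key=lambda u: sims[u], reverse=True)[:k]
            weights = sims[neighbors]
            if len(neighbors) == 0 or weights.sum() <= 0:
                return 0.0
            return float(weights @ ratings[neighbors, item] / weights.sum())  # weighted average

        # Toy user-item rating matrix (rows: users, columns: items, 0 = unrated).
        R = np.array([[5, 3, 0, 1],
                      [4, 0, 0, 1],
                      [1, 1, 0, 5],
                      [0, 1, 5, 4]], dtype=float)
        print(predict_rating(R, user=1, item=1))  # estimate how user 1 would rate item 1

    A content-based or hybrid variant would add item features or, as the thesis suggests, ontology relations on top of this purely behavioral signal.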

    Energy-Efficient Softwarized Networks: A Survey

    Full text link
    With the dynamic demands and stringent requirements of various applications, networks need to be high-performance, scalable, and adaptive to changes. Researchers and industries view network softwarization as the best enabler for the evolution of networking to tackle current and prospective challenges. Network softwarization must provide programmability and flexibility to network infrastructures and allow agile management, along with higher control for operators. While satisfying the demands and requirements of network services, energy cannot be overlooked, considering its effects on the sustainability of the environment and business. This paper discusses energy efficiency in modern and future networks with three network softwarization technologies, software-defined networking (SDN), network functions virtualization (NFV), and network slicing (NS), introduced in an energy-oriented context. With that framework in mind, we review the literature based on network scenarios, control/MANO layers, and energy-efficiency strategies. Following that, we compare the references regarding approach, evaluation method, criterion, and metric attributes to demonstrate the state of the art. Last, we analyze the classified literature, summarize lessons learned, and present ten essential concerns to open discussions about future research opportunities on energy-efficient softwarized networks. Comment: Accepted draft for publication in TNSM with minor updates and editing.

    Data-Juicer: A One-Stop Data Processing System for Large Language Models

    Full text link
    The immense evolution in Large Language Models (LLMs) has underscored the importance of massive, diverse, and high-quality data. Despite this, existing open-source tools for LLM data processing remain limited and mostly tailored to specific datasets, with an emphasis on the reproducibility of released data over adaptability and usability, inhibiting potential applications. In response, we propose a one-stop, powerful yet flexible and user-friendly LLM data processing system named Data-Juicer. Our system offers over 50 built-in versatile operators and pluggable tools, which synergize modularity, composability, and extensibility dedicated to diverse LLM data processing needs. By incorporating visualized and automatic evaluation capabilities, Data-Juicer enables a timely feedback loop to accelerate data processing and gain data insights. To enhance usability, Data-Juicer provides out-of-the-box components for users with various backgrounds, and fruitful data recipes for LLM pre-training and post-tuning. Further, we employ multi-facet system optimization and seamlessly integrate Data-Juicer with both LLM and distributed computing ecosystems, to enable efficient and scalable data processing. Empirical validation of the generated data recipes reveals considerable improvements in LLaMA performance for various pre-training and post-tuning cases, demonstrating up to a 7.45% relative improvement in average score across 16 LLM benchmarks and a 16.25% higher win rate using pair-wise GPT-4 evaluation. The system's efficiency and scalability are also validated, supported by up to an 88.7% reduction in single-machine processing time, 77.1% and 73.1% less memory and CPU usage respectively, and 7.91x processing acceleration when utilizing distributed computing ecosystems. Our system, data recipes, and multiple tutorial demos are released, calling for broader research centered on LLM data. Comment: Under continuous maintenance and updating; the system, refined data recipes, and demos are at https://github.com/alibaba/data-juicer
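    Data-Juicer's real operators and recipe format are documented in the repository linked above and are not reproduced here. The following Python sketch only illustrates the general idea of composable, pluggable data-processing operators that the abstract describes; every class, function, and operator in it is invented for the example.

        from dataclasses import dataclass
        from typing import Callable, Iterable, Iterator

        Sample = dict  # a single text record, e.g. {"text": "..."}

        @dataclass
        class Mapper:
            """Rewrites each sample (e.g. cleaning or normalizing text)."""
            fn: Callable[[Sample], Sample]
            def __call__(self, samples: Iterable[Sample]) -> Iterator[Sample]:
                return (self.fn(s) for s in samples)

        @dataclass
        class Filter:
            """Keeps only samples that satisfy a predicate (e.g. a length or quality check)."""
            keep: Callable[[Sample], bool]
            def __call__(self, samples: Iterable[Sample]) -> Iterator[Sample]:
                return (s for s in samples if self.keep(s))

        def run_recipe(samples, operators):
            """Chain pluggable operators over a stream of samples; the operator list is the 'recipe'."""
            for op in operators:
                samples = op(samples)
            return list(samples)

        seen = set()
        recipe = [
            Mapper(lambda s: {**s, "text": s["text"].strip()}),                # normalize whitespace
            Filter(lambda s: len(s["text"]) >= 10),                            # drop very short texts
            Filter(lambda s: not (s["text"] in seen or seen.add(s["text"]))),  # exact deduplication
        ]
        data = [{"text": "  short "}, {"text": "a long enough sample"}, {"text": "a long enough sample"}]
        print(run_recipe(data, recipe))  # -> [{'text': 'a long enough sample'}]

    Keeping each operator a small, stateless callable is what makes such a pipeline easy to reorder, extend, and evaluate step by step, which is the property the abstract emphasizes.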

    Intelligent Selection Techniques For Virtual Environments

    Get PDF
    Selection in 3D games and simulations is a well-studied problem. Many techniques have been created to address many of the typical scenarios a user could experience. For any single scenario with consistent conditions, there is likely a technique which is well suited. If there isn't, then there is an opportunity to create one that best suits the expected conditions of that new scenario. It is critical that the user be given an appropriate technique to interact with their environment. Without it, the entire experience is at risk of becoming burdensome and unenjoyable. With all of the different possible scenarios, it can become problematic when two or more are part of the same program. If they are placed closely together, or even intertwined, then the developer is often forced to pick a single technique that works passably in both but is likely not optimal for either, or optimal in only one of them. In this case, the user is left to perform selections with a technique that is lacking in one way or another, which can increase errors and frustration. In our research, we outlined different selection scenarios, all of which were classified by their level of object density (number of objects in the scene) and object velocity. We then performed an initial study on how these conditions impact the performance of various selection techniques, including a new selection technique that we developed for this test, called Expand. Our results showed, among other things, that a standard Raycast technique works well in slow-moving and sparse environments, while our new Expand technique works well in denser environments. With the results from our first study, we sought to develop something that would bridge the gap in performance between the selection techniques tested. Our idea was a framework that could host several different selection techniques and determine which was best suited at any time. Each selection technique would report how effective it was, given the provided scenario conditions. The framework was responsible for activating the appropriate selection technique when the user made a selection attempt. With this framework in hand, we performed two additional user studies to determine how effective it could be in actual use, and to identify its strengths and weaknesses. Each study compared several selection techniques individually against the framework, which utilized them collectively, picking the most suitable. Again, the same scenarios from our first study were reused. From these studies, we gained a deeper understanding of the many challenges associated with automatic selection technique determination. The results from these two studies showed that transitioning between techniques was potentially viable, but rife with design challenges that made its optimization quite difficult. In an effort to sidestep some of the issues surrounding the switching of discrete techniques, we attacked the problem from the other direction and made a single technique act similarly to two techniques, adjusting dynamically to conditions. We performed a user study to analyze the performance of such a technique, with promising results. While the qualitative differences were small, user feedback indicated that users preferred this technique over the others, which were static in nature. Finally, we sought to gain a deeper understanding of existing selection techniques that were dynamic in nature, to study how they were designed, and to consider how they could be improved.
We scrutinized the attributes of each technique that were already adjusted dynamically, or that could be, and devised new ways in which each technique could be improved. Within this analysis, we also considered how each technique could best be integrated into the Auto-Select framework we proposed earlier. This overall analysis of the latest selection techniques left us with an array of new variants that warrant being created and tested against their existing versions. Our overall research goal was to analyze selection techniques that intelligently adapt to their environment. We believe we achieved this through several iterative development cycles, including user studies, ultimately leading to innovation in the field of selection. We conclude our research with yet more questions left to be answered, and we intend to pursue further research regarding some of these questions, as time permits.
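    The Auto-Select framework itself lives inside a 3D application and is not shown here; the Python sketch below is only a schematic of the mechanism described above, in which each technique reports a suitability score for the current scenario conditions (object density and velocity) and the framework activates the highest-scoring one at selection time. The class names and scoring heuristics are invented for illustration.

        from dataclasses import dataclass
        from typing import List

        @dataclass
        class Scenario:
            object_density: float   # e.g. objects per unit volume (hypothetical units)
            object_velocity: float  # e.g. average object speed

        class SelectionTechnique:
            """Each technique reports how well suited it is to the current scenario."""
            name = "base"
            def suitability(self, scenario: Scenario) -> float:
                raise NotImplementedError
            def select(self, scenario: Scenario) -> str:
                return f"selection performed with {self.name}"

        class Raycast(SelectionTechnique):
            name = "raycast"
            def suitability(self, scenario: Scenario) -> float:
                # Illustrative heuristic: favored in sparse, slow-moving scenes.
                return 1.0 / (1.0 + scenario.object_density + scenario.object_velocity)

        class Expand(SelectionTechnique):
            name = "expand"
            def suitability(self, scenario: Scenario) -> float:
                # Illustrative heuristic: favored as object density grows.
                return scenario.object_density / (1.0 + scenario.object_density)

        class AutoSelect:
            """On each selection attempt, activate the technique with the highest reported suitability."""
            def __init__(self, techniques: List[SelectionTechnique]):
                self.techniques = techniques
            def select(self, scenario: Scenario) -> str:
                best = max(self.techniques, key=lambda t: t.suitability(scenario))
                return best.select(scenario)

        framework = AutoSelect([Raycast(), Expand()])
        print(framework.select(Scenario(object_density=0.1, object_velocity=0.2)))  # sparse, slow
        print(framework.select(Scenario(object_density=5.0, object_velocity=1.0)))  # dense

    The design challenge the user studies surfaced, namely how to transition smoothly when the winning technique changes between attempts, sits outside this sketch.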