
    Signed double Roman domination on cubic graphs

    The signed double Roman domination problem is a combinatorial optimization problem on a graph that asks for a feasible assignment of a label from {±1, 2, 3} to each vertex such that the total sum of assigned labels is minimized. An assignment is feasible whenever (i) every vertex labeled ±1 has at least one neighbor with label in {2, 3}; (ii) every vertex labeled −1 has a 3-labeled neighbor or at least two 2-labeled neighbors; and (iii) the sum of labels over the closed neighborhood of every vertex is positive. The cumulative weight of an optimal labeling is called the signed double Roman domination number (SDRDN). In this work, we first consider the problem on general cubic graphs of order n, for which we present a sharp n/2 + Θ(1) lower bound on the SDRDN by means of the discharging method. Moreover, we derive a new best upper bound. Observing that generalized Petersen graphs often minimize the SDRDN over the class of cubic graphs of a fixed order, we then study them for independent interest and propose a constraint-programming-guided proof. We then use these insights to determine the SDRDNs of subcubic 2×m grid graphs, among other results.
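
    To make the feasibility conditions concrete, here is a minimal Python sketch (not from the paper) that checks conditions (i)-(iii) for a given labeling and brute-forces the SDRDN on a tiny graph; the adjacency-dict encoding and the exhaustive search are illustrative assumptions only, far from the discharging and constraint programming techniques the paper actually uses.

        from itertools import product

        def is_feasible(graph, label):
            # graph: adjacency dict, vertex -> set of neighbors (hypothetical encoding)
            for v, nbrs in graph.items():
                nl = [label[u] for u in nbrs]
                # (i) a vertex labeled +1 or -1 needs a neighbor labeled 2 or 3
                if abs(label[v]) == 1 and not any(l in (2, 3) for l in nl):
                    return False
                # (ii) a -1 vertex needs one 3-neighbor or at least two 2-neighbors
                if label[v] == -1 and not (3 in nl or nl.count(2) >= 2):
                    return False
                # (iii) the closed-neighborhood sum must be positive
                if label[v] + sum(nl) <= 0:
                    return False
            return True

        def sdrdn_brute_force(graph):
            # Exhaustive search over {-1, 1, 2, 3}^V; viable only for tiny graphs.
            verts = list(graph)
            best = None
            for labels in product((-1, 1, 2, 3), repeat=len(verts)):
                if is_feasible(graph, dict(zip(verts, labels))):
                    w = sum(labels)
                    best = w if best is None else min(best, w)
            return best

        # Example: K4, the smallest cubic graph.
        k4 = {0: {1, 2, 3}, 1: {0, 2, 3}, 2: {0, 1, 3}, 3: {0, 1, 2}}
        print(sdrdn_brute_force(k4))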

    Green Carbon Footprint for Model Inference Serving via Exploiting Mixed-Quality Models and GPU Partitioning

    This paper presents a solution to the challenge of mitigating carbon emissions from large-scale high-performance computing (HPC) systems and datacenters that host machine learning (ML) inference services. ML inference is critical to modern technology products, but it is also a significant contributor to datacenter compute cycles and carbon emissions. We introduce Clover, a carbon-friendly ML inference service runtime system that balances performance, accuracy, and carbon emissions through mixed-quality models and GPU resource partitioning. Our experimental results demonstrate that Clover is effective in substantially reducing carbon emissions while maintaining high accuracy and meeting service level agreement (SLA) targets. Therefore, it is a promising solution toward achieving carbon neutrality in HPC systems and datacenters.
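
    As a sketch of the trade-off Clover navigates, the hypothetical Python below picks, from offline-profiled (model variant, GPU partition) configurations, the lowest-carbon one that still meets accuracy and latency (SLA) floors; the Config fields, candidate variants, and all numbers are invented for illustration and do not describe Clover's actual runtime.

        from dataclasses import dataclass

        @dataclass
        class Config:
            model: str           # quality variant of the served model
            gpu_fraction: float  # share of a partitioned GPU
            accuracy: float      # offline-profiled accuracy
            latency_ms: float    # profiled tail latency
            gco2_per_1k: float   # profiled grams CO2 per 1k requests

        def pick_config(configs, min_accuracy, sla_ms):
            # Keep only accuracy- and SLA-compliant options, then take the greenest.
            feasible = [c for c in configs
                        if c.accuracy >= min_accuracy and c.latency_ms <= sla_ms]
            return min(feasible, key=lambda c: c.gco2_per_1k, default=None)

        candidates = [
            Config("large",  1.00, 0.78, 40.0, 9.0),
            Config("medium", 0.50, 0.76, 25.0, 4.0),
            Config("small",  0.25, 0.70, 15.0, 2.0),
        ]
        print(pick_config(candidates, min_accuracy=0.75, sla_ms=30.0))  # -> "medium"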

    An Online Resource Scheduling for Maximizing Quality-of-Experience in Meta Computing

    Meta Computing is a new computing paradigm that aims to solve the problem of computing islands in current edge computing paradigms and to integrate all the resources on a network by incorporating cloud, edge, and particularly terminal-end devices, promising to relieve the shortage of computing power. At this stage, however, technical limitations make it impossible to integrate the resources of an entire network. We therefore create a new meta computing architecture composed of multiple meta computers, each of which integrates the resources of a small-scale network. For meta computing to be widely adopted, its service quality and user experience cannot be ignored. Consider a meta computing system that provides services to users by scheduling meta computers: how should one choose among multiple meta computers to achieve maximum Quality-of-Experience (QoE) under a limited budget, especially when the true expected QoE of each meta computer is not known a priori? Existing studies usually ignore costs and budgets and barely consider the ubiquitous law of diminishing marginal utility. In this paper, we formulate a resource scheduling problem from the perspective of the multi-armed bandit (MAB). To determine a scheduling strategy that maximizes the total QoE utility under a limited budget, we propose an upper confidence bound (UCB) based algorithm and model the utility of service as a concave function of total QoE, capturing the diminishing marginal utility observed in the real world. We theoretically upper bound the regret of our proposed algorithm, showing sublinear growth in the budget. Finally, extensive experiments are conducted, and the results indicate the correctness and effectiveness of our algorithm.
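
    The following Python sketch illustrates the flavor of such a budgeted UCB scheduler: the expected QoE of each meta computer is unknown to the learner, every scheduling decision has a cost charged against the budget, and utility is a concave (here square-root) function of accumulated QoE. The index, cost model, and simulated feedback are illustrative assumptions, not the paper's exact algorithm or analysis.

        import math, random

        def budgeted_ucb(qoe_means, costs, budget):
            # qoe_means are hidden from the learner; they only drive the simulation.
            k = len(qoe_means)
            pulls, qoe_sum = [0] * k, [0.0] * k
            total_qoe, t = 0.0, 0
            while True:
                affordable = [i for i in range(k) if costs[i] <= budget]
                if not affordable:
                    break
                t += 1
                untried = [i for i in affordable if pulls[i] == 0]
                if untried:
                    i = untried[0]  # try each affordable meta computer once first
                else:
                    # UCB index: empirical QoE per unit cost plus exploration bonus
                    i = max(affordable, key=lambda j:
                            qoe_sum[j] / pulls[j] / costs[j]
                            + math.sqrt(2 * math.log(t) / pulls[j]))
                budget -= costs[i]
                reward = max(0.0, random.gauss(qoe_means[i], 0.1))  # simulated QoE
                pulls[i] += 1
                qoe_sum[i] += reward
                total_qoe += reward
            return math.sqrt(total_qoe)  # concave utility: diminishing marginal gains

        random.seed(0)
        print(budgeted_ucb(qoe_means=[0.5, 0.7, 0.6], costs=[1.0, 1.5, 1.2], budget=50.0))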

    The Metaverse: Survey, Trends, Novel Pipeline Ecosystem & Future Directions

    The Metaverse offers a second world beyond reality, where boundaries are non-existent and possibilities are endless, through engagement and immersive experiences enabled by virtual reality (VR) technology. Many disciplines can benefit from the advancement of the Metaverse when it is properly developed, including technology, gaming, education, art, and culture. Nevertheless, developing the Metaverse environment to its full potential is an ambiguous task that needs proper guidance and direction. Existing surveys on the Metaverse focus only on a specific aspect or discipline of the Metaverse and lack a holistic view of the entire process. A more holistic, multi-disciplinary, in-depth review, oriented to both academia and industry, is therefore required to provide a thorough study of the Metaverse development pipeline. To address these issues, we present in this survey a novel multi-layered pipeline ecosystem composed of (1) the Metaverse computing, networking, communications, and hardware infrastructure, (2) environment digitization, and (3) user interactions. For every layer, we discuss the components that detail the steps of its development, and for each component we examine the impact of a set of enabling technologies and empowering domains (e.g., Artificial Intelligence, Security & Privacy, Blockchain, Business, Ethics, and Social) on its advancement. In addition, we explain how these technologies support decentralization, interoperability, user experiences, interactions, and monetization. Our study highlights the existing challenges for each component, followed by research directions and potential solutions. To the best of our knowledge, this survey is the most comprehensive to date, allowing users, scholars, and entrepreneurs to gain an in-depth understanding of the Metaverse ecosystem and to identify their opportunities for contribution.

    Technical Dimensions of Programming Systems

    Programming requires much more than just writing code in a programming language. It is usually done in the context of a stateful environment, by interacting with a system through a graphical user interface. Yet, this wide space of possibilities lacks a common structure for navigation. Work on programming systems fails to form a coherent body of research, making it hard to improve on past work and advance the state of the art. In computer science, much has been said and done to allow comparison of programming languages, yet no similar theory exists for programming systems; we believe that programming systems deserve a theory too. We present a framework of technical dimensions which capture the underlying characteristics of programming systems and provide a means for conceptualizing and comparing them. We identify technical dimensions by examining past influential programming systems and reviewing their design principles, technical capabilities, and styles of user interaction. Technical dimensions capture characteristics that may be studied, compared, and advanced independently. This makes it possible to talk about programming systems in a way that can be shared and constructively debated rather than relying solely on personal impressions. Our framework is derived using a qualitative analysis of past programming systems. We outline two concrete ways of using our framework. First, we show how it can be used to analyze a recently developed programming system. Then, we use it to identify an interesting unexplored point in the design space of programming systems. Much research effort focuses on building programming systems that are easier to use, accessible to non-experts, moldable and/or powerful, but such efforts are disconnected. They are informal, guided by the personal visions of their authors, and thus evaluable and comparable only on the basis of individual experience of using them. By providing foundations for more systematic research, we can help programming systems researchers to stand, at last, on the shoulders of giants.

    Interactive documentary and non-profit engagement: building an inclusive story practice in heritage

    This thesis examines the potential of the interactive documentary (i-Doc) as an inclusive model of storytelling that can benefit non-profit organizations in widening public participation, improving engagement, and supporting fundraising. Through the case study of a single organization, the Council for British Archaeology (CBA), the research addresses the diversity gap in heritage storytelling as the first step in improving public participation from diverse audiences. Through the analysis of primary and secondary data from the CBA, the study identifies a lack of understanding of immigrant perspectives of heritage in the UK, proposing to create an i-Doc that foregrounds such underrepresented voices through a co-creative film process. By adopting a participatory approach, the study then develops an interactive storytelling platform through an iterative design process with participants, exploring personal themes of archaeology and heritage in an everyday context. The result of the study is a working model of an i-Doc that presents digital stories of the participants. This storytelling model offers the CBA the opportunity to increase its digital visibility, supporting its goals of using innovative creative avenues to reach new audiences, diversify its membership, and support its long-term fundraising strategies. The research contributes new insights into adopting the interactive documentary as a creative pathway to building inclusive and robust engagement methods that can benefit non-profit organizations, considering the challenges and opportunities that such technologies present.

    D-STACK: High Throughput DNN Inference by Effective Multiplexing and Spatio-Temporal Scheduling of GPUs

    Hardware accelerators such as GPUs are required for real-time, low-latency inference with Deep Neural Networks (DNNs). However, due to inherent limits on the parallelism they can exploit, DNNs often under-utilize the capacity of today's high-end accelerators. Although spatial multiplexing of the GPU leads to higher GPU utilization and higher inference throughput, a number of challenges remain: finding, through profiling, the GPU percentage that right-sizes the GPU for each DNN; determining an optimal batching of requests that balances throughput improvement against application-specific deadlines and service level objectives (SLOs); and maximizing throughput by appropriately scheduling DNNs. This paper introduces a dynamic and fair spatio-temporal scheduler (D-STACK) that enables multiple DNNs to run on the GPU concurrently. To help allocate the appropriate GPU percentage (which we call the "Knee"), we develop and validate a model that estimates the parallelism each DNN can utilize. We also develop a lightweight optimization formulation to find an efficient batch size for each DNN operating with D-STACK. We bring together our optimizations and our spatio-temporal scheduler to provide a holistic inference framework, and we demonstrate its ability to provide high throughput while meeting application SLOs. We compare D-STACK with an ideal scheduler that can allocate the right GPU percentage for every DNN kernel: D-STACK achieves more than 90 percent of the ideal scheduler's throughput and GPU utilization. We also compare D-STACK with other GPU multiplexing and scheduling methods (e.g., NVIDIA Triton, Clipper, Nexus) using popular DNN models. Our controlled experiments multiplexing several popular DNN models achieve up to a 1.6x improvement in GPU utilization and up to a 4x improvement in inference throughput.
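
    As a toy illustration of the "Knee" idea (not D-STACK's actual model, which estimates a DNN's exploitable parallelism analytically), the Python sketch below scans a hypothetical profiled throughput-vs-GPU-percentage curve and stops where the marginal gain flattens; the profile values and the 5 percent threshold are made up.

        def find_knee(profile, min_gain=0.05):
            # profile: list of (gpu_percent, throughput) pairs, sorted by gpu_percent
            knee = profile[0]
            for (p0, t0), (p1, t1) in zip(profile, profile[1:]):
                if (t1 - t0) / t0 < min_gain:  # marginal gain has flattened
                    return knee
                knee = (p1, t1)
            return knee

        # Hypothetical profiled throughput (requests/s) for one DNN.
        profile = [(10, 120), (20, 230), (30, 320), (40, 380), (50, 400),
                   (60, 408), (70, 410), (80, 411), (90, 411), (100, 412)]
        print(find_knee(profile))  # -> (50, 400): past 50%, extra GPU buys little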

    Special Topics in Information Technology

    This open access book presents thirteen outstanding doctoral dissertations in Information Technology from the Department of Electronics, Information and Bioengineering, Politecnico di Milano, Italy. Information Technology has always been highly interdisciplinary, as many aspects have to be considered in IT systems. The doctoral studies program in IT at Politecnico di Milano emphasizes this interdisciplinary nature, which is becoming more and more important in recent technological advances, in collaborative projects, and in the education of young researchers. Accordingly, the focus of advanced research is on pursuing a rigorous approach to specific research topics starting from a broad background in various areas of Information Technology, especially Computer Science and Engineering, Electronics, Systems and Control, and Telecommunications. Each year, more than 50 PhDs graduate from the program. This book gathers the outcomes of the thirteen best theses defended in 2020-21 and selected for the IT PhD Award. Each of the authors provides a chapter summarizing his/her findings, including an introduction, a description of methods, main achievements, and future work on the topic. Hence, the book provides a cutting-edge overview of the latest research trends in Information Technology at Politecnico di Milano, presented in an easy-to-read format that will also appeal to non-specialists.

    CITIES: Energetic Efficiency, Sustainability; Infrastructures, Energy and the Environment; Mobility and IoT; Governance and Citizenship

    This book collects important contributions on smart cities. It was created in collaboration with ICSC-CITIES2020, held in San José, Costa Rica, in 2020, and gathers articles on energetic efficiency and sustainability; infrastructures, energy, and the environment; mobility and IoT; and governance and citizenship.