
    How Kano’s Performance Mediates Perceived SERVQUAL Impact on Kansei

    Through the Kansei Engineering (KE) methodology in services, perceived service quality shows a direct impact on the Kansei response. To strengthen the KE methodology, the Kano model is embedded, considering the attractive [A] and one-dimensional [O] performances. However, to what extent Kano performance has a significant impact on Kansei is questionable and has not yet been explored. It is beneficial to measure the effort spent to improve a certain service attribute, considering its Kano performance and its impact on Kansei. This study on logistics services confirms that Kano's attractive category [A] shows the highest impact on Kansei (with a loading of 0.502), followed by the one-dimensional [O] and must-be [M] categories (with loadings of 0.514 and 0.507, respectively). The service provider should prioritize Kano's [A] service attributes first for improvement. Keywords - Kano, logistics services, Kansei, SERVQUAL

    Multi-domain maturity model for AI and analytic capability in power generation sector: A case study of ABB PAEN Oy

    As more smart devices and smart meters become available on the market, industry actors offer AI and analytics suites and platforms where data streams can be contextualized and leveraged in pre-made, industry-specific templates and models, together with self-service machine learning environments. How can a traditional EPC company use its domain knowledge when offering these AI and analytics suites? The assumption made is that there is no inherent value in the AI and analytics suite without data. How should this assumption be incorporated into projects executed before the operation phase, when operational data is non-existent? This thesis investigates which elements provide a value proposition in the AI and analytics suite and maps them against the domain knowledge of the EPC company. The finding is a novel design in which operational data is integrated into the design of new projects. A survey on data utilization in the power generation sector is also conducted based on the same elements. The findings are that while granularity is low, quality is good, with an overall maturity between managed and proactive data utilization; this indicates that there are few automated data streams, but that the data is available in a structured and defined way.

    ASCR/HEP Exascale Requirements Review Report

    This draft report summarizes and details the findings, results, and recommendations derived from the ASCR/HEP Exascale Requirements Review meeting held in June 2015. The main conclusions are as follows. 1) Larger, more capable computing and data facilities are needed to support HEP science goals in all three frontiers: Energy, Intensity, and Cosmic. The expected scale of the demand at the 2025 timescale is at least two orders of magnitude -- and in some cases greater -- than that available currently. 2) The growth rate of data produced by simulations is overwhelming the current ability of both facilities and researchers to store and analyze it. Additional resources and new techniques for data analysis are urgently needed. 3) Data rates and volumes from HEP experimental facilities are also straining the ability to store and analyze large and complex data volumes. Appropriately configured leadership-class facilities can play a transformational role in enabling scientific discovery from these datasets. 4) A close integration of HPC simulation and data analysis will aid greatly in interpreting results from HEP experiments. Such an integration will minimize data movement and facilitate interdependent workflows. 5) Long-range planning between HEP and ASCR will be required to meet HEP's research needs. To best use ASCR HPC resources the experimental HEP program needs a) an established long-term plan for access to ASCR computational and data resources, b) an ability to map workflows onto HPC resources, c) the ability for ASCR facilities to accommodate workflows run by collaborations that can have thousands of individual members, d) to transition codes to the next-generation HPC platforms that will be available at ASCR facilities, e) to build up and train a workforce capable of developing and using simulations and analysis to support HEP scientific research on next-generation systems. Comment: 77 pages, 13 Figures; draft report, subject to further revision

    Big Data and Large-scale Data Analytics: Efficiency of Sustainable Scalability and Security of Centralized Clouds and Edge Deployment Architectures

    One of the significant shifts of the next-generation computing technologies will certainly be in the development of Big Data (BD) deployment architectures. Apache Hadoop, the BD landmark, has evolved into a widely deployed BD operating system. Its new features include a federation structure and many associated frameworks, which give Hadoop 3.x the maturity to serve different markets. This dissertation addresses two leading issues involved in exploiting BD and the large-scale data analytics realm using the Hadoop platform: (i) scalability, which directly affects system performance and overall throughput, addressed using portable Docker containers; and (ii) security, which spreads the adoption of data protection practices among practitioners using access controls. An Enhanced MapReduce Environment (EME), an OPportunistic and Elastic Resource Allocation (OPERA) scheduler, a BD Federation Access Broker (BDFAB), and a Secure Intelligent Transportation System (SITS) with a multi-tier architecture for data streaming to the cloud are the main contributions of this thesis.

    Financing Fisheries Change: Learning from Case Studies

    The fields of fisheries sustainability and conservation have evolved and grown considerably over the past decade. This evolution, its broad scope and the scale of capital needed for support will require project developers to seek the support and guidance of an array of investors, in both the non-profit and for-profit sectors. Non-profits, social change leaders and business entrepreneurs will need to work together to create innovatively structured projects that can both build value for private investors and improve the speed and scale of fisheries conservation impacts. This report presents case studies of groups who have incorporated innovative financing structures and partnerships into their strategies, and analyzes the lessons learned to offer investors and NGOs guidance for future projects. The 11 cases presented are divided into three groups depending on how conservation and financing strategies are tied together. The groups are:
    - Assuring conservation through ownership: using equity for asset purchase with an exit strategy.
    - Promoting conservation through targeted lending: filling credit gaps with debt instruments.
    - Enabling conservation by combining services and capital: incubating and providing information, connections and financing to promote business development.

    The Conference Proceedings of the 2003 Air Transport Research Society (ATRS) World Conference, Volume 1

    UNOAI Report 03-5
    https://digitalcommons.unomaha.edu/facultybooks/1131/thumbnail.jp

    Rise of the Planet of Serverless Computing: A Systematic Review

    Serverless computing is an emerging cloud computing paradigm being adopted to develop a wide range of software applications. It allows developers to focus on the application logic at the granularity of functions, thereby freeing them from tedious and error-prone infrastructure management. Meanwhile, its unique characteristics pose new challenges to the development and deployment of serverless-based applications. To tackle these challenges, enormous research effort has been devoted. This paper provides a comprehensive literature review characterizing the current state of research on serverless computing. Specifically, it covers 164 papers across 17 research directions of serverless computing, including performance optimization, programming frameworks, application migration, multi-cloud development, testing and debugging, etc. It also derives research trends, focus areas, and commonly used platforms for serverless computing, as well as promising research opportunities.

    The role of digital servitization in mitigating the impacts of the Covid-19 pandemic: the case of Italian manufacturing companies.

    A multi-month research study was carried out in an attempt to demonstrate how services and digital technologies performed during the pandemic. In particular, manufacturers that were prepared from digital and service perspectives were those that reacted faster in the initial months of the pandemic. The research also highlighted that further investments will be allocated to services and technologies.