
    Exploring the roles of people, governance and technology in organizational readiness for emerging technologies

    The rapid development and release of emerging technologies have made their adoption challenging, and organizational adoption efforts frequently fail. It is still unclear which component(s) of an organization play the prominent role(s) in organizational readiness to adopt emerging technologies. Using a mixed-methods approach, this study conducted an online survey of 83 South African organizations on server virtualization adoption. Server virtualization is an emerging technology that is widely adopted by organizations in developed countries; in recent surveys, IT executives rated it the second-most important technology for achieving cost reductions and optimizing productivity. Very little is known about server virtualization adoption in organizations in developing countries. It was found that people and technology play prominent roles in South African organizations' readiness to adopt server virtualization. Server virtualization also has inhibitors, such as a lack of IT skills and software and licensing costs, that the IT industry and adopting organizations should consider.

    Data virtualization design model for near real time decision making in business intelligence environment

    The main purpose of Business Intelligence (BI) is to support an organization's strategic, operational and tactical decisions by providing comprehensive, accurate and vivid data to the decision makers. A data warehouse (DW), which serves as the input for decision-making activities, is created through a complex process known as Extract, Transform and Load (ETL). ETL operates at pre-defined times and requires time to process and transfer data. Consequently, providing near-real-time information to facilitate data integration in support of the decision-making process is a known issue. Inaccessibility to near-real-time information could be overcome with Data Virtualization (DV), as it provides a unified, abstracted, near-real-time, and encapsulated view of information for querying. Nevertheless, there is currently a lack of studies on BI models for developing and managing data in a virtual manner that can fulfil an organization's needs. Therefore, the main aim of this study is to propose a DV model for near-real-time decision making in a BI environment. Design science research methodology was adopted to accomplish the research objectives. As a result of this study, a model called the Data Virtualization Development Model (DVDeM) is proposed that addresses the phases and components which affect the BI environment. To validate the model, expert reviews and focus group discussions were conducted. A prototype based on the proposed model was also developed and then implemented in two case studies. An instrument was also developed to measure the usability of the prototype in providing near-real-time data. In total, 60 participants were involved, and the findings indicated that 93% of the participants agreed that the DVDeM-based prototype was able to provide near-real-time data for supporting the decision-making process.
    The findings also showed that the majority of the participants (more than 90%) in both the education and business sectors affirmed the workability of the DVDeM and the usability of the prototype, in particular its ability to deliver near-real-time decision-making data. The findings also indicate theoretical and practical contributions for developers in building efficient BI applications using the DV technique. The mean values for each measurement item were greater than 4, indicating that the respondents agreed with the statement for each measurement item, while the mean scores for the overall usability attributes of the DVDeM design model fell under "High" or "Fairly High". Therefore, the results give sufficient indication that when the DVDeM model is adopted in developing a system, the usability of the produced system is perceived by the majority of respondents as high, and the system is able to support near-real-time decision making.
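The contrast the abstract draws between batch ETL and data virtualization can be illustrated in a few lines. The Python sketch below is a minimal analogy, not the DVDeM model itself; the table names and data are hypothetical. It federates two live sources at query time, so a write to either source is visible on the very next query, which is the near-real-time property a warehouse loaded at pre-defined times cannot offer.

```python
import sqlite3

# Two independent "live" operational sources (hypothetical data).
sales = sqlite3.connect(":memory:")
sales.execute("CREATE TABLE orders (region TEXT, amount REAL)")
sales.executemany("INSERT INTO orders VALUES (?, ?)",
                  [("north", 120.0), ("south", 80.0)])

crm = sqlite3.connect(":memory:")
crm.execute("CREATE TABLE regions (region TEXT, manager TEXT)")
crm.executemany("INSERT INTO regions VALUES (?, ?)",
                [("north", "Ada"), ("south", "Linus")])

def virtual_view():
    """Federate both sources at query time; no pre-built warehouse.

    Each call reads the sources as they are *now*, which is what gives a
    virtualized view its near-real-time property. A batch ETL pipeline
    would instead serve whatever was loaded at the last scheduled run.
    """
    totals = dict(sales.execute(
        "SELECT region, SUM(amount) FROM orders GROUP BY region"))
    managers = dict(crm.execute("SELECT region, manager FROM regions"))
    return {r: {"total": totals.get(r, 0.0), "manager": m}
            for r, m in managers.items()}

print(virtual_view()["north"]["total"])  # 120.0
# A later write to a source is visible on the very next query:
sales.execute("INSERT INTO orders VALUES ('north', 50.0)")
print(virtual_view()["north"]["total"])  # 170.0
```

A production data-virtualization layer adds caching, query push-down, and access control on top of this federation idea, but the query-time abstraction over live sources is the core of it.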

    Notes on Cloud computing principles

    This letter provides a review of fundamental distributed systems and economic Cloud computing principles. These principles are frequently deployed in their respective fields, but their inter-dependencies are often neglected. Given that Cloud Computing is, first and foremost, a new business model, a new model for selling computational resources, understanding these concepts is facilitated by treating them in unison. Here, we review some of the most important concepts and how they relate to each other.

    Mixed-mode multicore reliability

    Future processors are expected to observe increasing rates of hardware faults. Using Dual-Modular Redundancy (DMR), two cores of a multicore can be loosely coupled to redundantly execute a single software thread, providing very high coverage from many different sources of faults. This reliability, however, comes at a high price in terms of per-thread IPC and overall system throughput. We make the observation that a user may want to run both applications requiring high reliability, such as financial software, and more fault-tolerant applications requiring high performance, such as media or web software, on the same machine at the same time. Yet a traditional DMR system must fully operate in redundant mode whenever any application requires high reliability. This paper proposes a Mixed-Mode Multicore (MMM), which enables most applications, including the system software, to run with high reliability in DMR mode, while applications that need high performance can avoid the penalty of DMR. Though conceptually simple, two key challenges arise: 1) care must be taken to protect reliable applications from any faults occurring in applications running in high-performance mode, and 2) the desire to execute additional independent software threads for a performance application complicates the scheduling of computation to cores. After solving these issues, an MMM is shown to improve overall system performance, compared to a traditional DMR system, by approximately 2X when one reliable and one performance application are executing concurrently.
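The DMR baseline the paper builds on, executing the same computation twice and comparing the results, can be illustrated at the software level. The Python sketch below is a simplified analogy only (threads stand in for loosely coupled cores; all names are hypothetical), including the mixed-mode idea that only reliability-critical work pays the 2x execution cost.

```python
import threading
import queue

def run_redundantly(fn, *args):
    """Run fn twice on two threads (standing in for two cores) and
    compare the results, as in dual-modular redundancy (DMR).
    A mismatch signals that a fault corrupted one execution."""
    results = queue.Queue()

    def worker():
        results.put(fn(*args))

    threads = [threading.Thread(target=worker) for _ in range(2)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    a, b = results.get(), results.get()
    if a != b:
        raise RuntimeError("DMR mismatch: fault detected")
    return a

def execute(fn, *args, reliable=False):
    """Mixed-mode dispatch: only work flagged as reliability-critical
    pays the 2x redundancy cost; performance-mode work runs once."""
    return run_redundantly(fn, *args) if reliable else fn(*args)

print(execute(sum, [1, 2, 3], reliable=True))   # DMR-checked: 6
print(execute(sum, [1, 2, 3], reliable=False))  # single run: 6
```

A real MMM does this in hardware with lockstep-checked cores and must additionally isolate performance-mode faults from reliable applications, which this software analogy does not capture.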

    A Process Framework for Managing Quality of Service in Private Cloud

    As information systems leaders tap into the global market of cloud computing-based services, they struggle to maintain consistent application performance due to the lack of a process framework for managing quality of service (QoS) in the cloud. Guided by disruptive innovation theory, the purpose of this case study was to identify a process framework for meeting the QoS requirements of private cloud service users. Private cloud implementation was explored by selecting an organization in California through purposeful sampling. Information was gathered by interviewing 23 information technology (IT) professionals, a mix of frontline engineers, managers, and leaders involved in the implementation of the private cloud. Another source of data was documents such as standard operating procedures, policies, and guidelines related to the private cloud implementation. Interview transcripts and documents were coded and sequentially analyzed. Three prominent themes emerged from the analysis of the data: (a) end-user expectations, (b) application architecture, and (c) trending analysis. The findings of this study may help IT leaders effectively manage QoS in cloud infrastructure and deliver reliable application performance, which may in turn increase the customer base and profitability of organizations. This study may contribute to positive social change as information systems managers and workers learn and apply the process framework for delivering stable and reliable cloud-hosted computer applications.

    Framework on Economical Implication and Issues of SADU Implementation

    Because software plays an increasingly important role in everyday life, interaction between humans and computers will grow in importance; therefore, the ability to support interactions for efficient re-use of experience is a major challenge for future systems. Trace-Based Reasoning will have a significant impact on applications that share experience, web-based ones in particular, since traces allow us to imagine several ways of interaction in systems and to combine multiple modes of interaction in a single system. In the conducted study, we aimed at developing an Assist System of Human Diagnostician (SADU), meaning that this system will hold human knowledge together with information retrieved through interaction with humans at SADU's request.

    Development and assessment of an organisational readiness framework for emerging technologies : an investigation of antecedents for South African organisations' readiness for server virtualisation

    Includes abstract. Includes bibliographical references (leaves 112-125). Determining, holistically, the factors that contribute to organisational readiness for these emerging technologies on the one hand, and the factors that influence organisational preparedness on its own on the other, raises another concern. This study developed a new conceptual readiness framework, NOIIE (an acronym for National e-readiness, Organisational preparedness, Industrial relationships, Internal resistance and External influence), for assessing organisations' readiness for emerging technologies and applications.

    An evaluation of information and communication technology application in South African construction industry

    Abstract: The construction industry is evolving like other allied industries. New innovations are born out of the quest to achieve more value for money while retaining a competitive edge in the international sphere. A comprehensive study on the application of information and communication technology (ICT) in South African construction work, particularly across the stages of construction work, is lacking. This study seeks to evaluate the information and communication technology tools used for construction activities in the South African construction industry. The research evaluates the level of awareness among construction professionals of new ICT tools in the fourth industrial revolution era. It also discusses the ICT tools used at the planning, design, and construction stages of construction. It employed the Professional Client/Consultants Service Agreement Committee (PROCSA) template but limited it to stages 0 to 5. It also discusses the challenges, drivers and benefits of using ICT tools for construction activities in South Africa. The primary data was collected through a questionnaire distributed online via the Questionpro platform to South African construction professionals in Gauteng Province only. One hundred and fifty (150) questionnaires were distributed; one hundred and twenty (120) of the responses were valid and used for the analysis, accounting for eighty per cent (80%) of the total survey. To ensure the reliability of the research questionnaire, Cronbach's alpha coefficient reliability testing was conducted on the scaled research questions. A comparison of means was used to address the level of awareness of ICT tools and the ICT tools used at the planning, design, and construction stages. Factor analysis was used to analyze the factors which serve as challenges to, drivers of, and benefits of the effective use of ICT tools. The study revealed that professionals have different awareness levels of ICT tools.
    They are more aware of ICT tools that are at the core of their professional duties. At the planning stage of construction work in South Africa, all professionals use design/estimation and simulation-based tools the most. In the design stage, the most frequently used tools are the computer-based tools and the design/estimation-based tools, which are used by engineers, architects, and construction project managers. At the construction stage, computer-based tools and administrative tools are the highest-ranked tools. The exploratory factor analysis revealed that the challenges to the use of ICT in the South African construction industry can be classified into people-, cost-, standardization-, and management-related problems. The measures to ensure the effective use of ICT tools for construction processes in South Africa are also grouped into user-related factors, ICT knowledge and end-uses. The benefits from the effective use of ICT tools for construction… M.Tech. (Construction Management)
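For readers unfamiliar with the reliability test mentioned in the abstract, Cronbach's alpha for k scaled items is alpha = k/(k-1) * (1 - sum of item variances / variance of total scores). A minimal Python sketch with hypothetical Likert-style responses (not the study's data):

```python
from statistics import variance

def cronbach_alpha(items):
    """Cronbach's alpha for a list of k item-score columns.

    items: list of k columns, one per scaled question, each holding
    every respondent's score for that question.
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total scores))
    """
    k = len(items)
    item_var = sum(variance(col) for col in items)
    totals = [sum(scores) for scores in zip(*items)]  # per-respondent sums
    return k / (k - 1) * (1 - item_var / variance(totals))

# Five hypothetical respondents answering three Likert-type questions.
q1 = [4, 5, 3, 4, 5]
q2 = [4, 4, 3, 5, 5]
q3 = [5, 5, 2, 4, 4]
print(round(cronbach_alpha([q1, q2, q3]), 2))  # 0.81
```

Values around 0.7 or higher are conventionally read as acceptable internal consistency for a scaled questionnaire, which is what the reliability check in the study is assessing.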

    Taxonomy and uncertainties of cloud manufacturing

    The manufacturing industry is currently undergoing rapid changes because of the rapid growth of advanced technologies in information systems and networks, which allow for collaboration around the world. This combination of the latest information technologies and advanced manufacturing networks has led to the growth of a new manufacturing model known as cloud manufacturing. Because cloud manufacturing is considered an emerging research area, there are significant gaps in the literature regarding the concept of cloud manufacturing, its implementation, and in particular the uncertainties that come with this new technology. This research aims to explain the concept of cloud manufacturing and its capabilities and potential. This work also introduces a cloud manufacturing taxonomy and investigates the uncertainties that come with employing cloud manufacturing. Finally, proposals for future research in the context of cloud manufacturing are presented to address opportunities in cloud manufacturing.

    View on 5G Architecture: Version 1.0

    The current white paper focuses on the results produced after one year of research, mainly from 16 projects working in the abovementioned domains. Over several months, representatives from these projects have worked together to identify the key findings of their projects and capture the commonalities as well as the differing approaches and trends. They have also worked to determine the challenges that remain to be overcome to meet the 5G requirements. The goal of the 5G Architecture Working Group is to use the results captured in this white paper to help the participating projects reach a common reference framework. The work of this working group will continue during the following year to capture the latest results produced by the projects and to further elaborate this reference framework. The 5G networks will be built around people and things and will natively meet the requirements of three groups of use cases:
    • Massive broadband (xMBB), which delivers gigabytes of bandwidth on demand
    • Massive machine-type communication (mMTC), which connects billions of sensors and machines
    • Critical machine-type communication (uMTC), which allows immediate feedback with high reliability and enables, for example, remote control of robots and autonomous driving.
    The demand for mobile broadband will continue to increase in the coming years, largely driven by the need to deliver ultra-high-definition video. However, 5G networks will also be the platform enabling growth in many industries, ranging from the IT industry to the automotive, manufacturing, and entertainment industries. 5G will enable new applications such as autonomous driving, remote control of robots, and tactile applications, but these also bring many challenges to the network. Some of these are related to providing latency on the order of a few milliseconds and reliability comparable to fixed lines.
    The biggest challenge for 5G networks, however, will be to cater for a diverse set of services and their requirements. To achieve this, the goal for 5G networks will be to improve the flexibility of the architecture. The white paper is organized as follows. In Section 2 we discuss the key business and technical requirements that drive the evolution of 4G networks into 5G. In Section 3 we provide the key points of the overall 5G architecture, whereas in Section 4 we elaborate on the functional architecture. Different issues related to physical deployment in the access, metro and core networks of the 5G network are discussed in Section 5, while in Section 6 we present software network enablers that are expected to play a significant role in future networks. Section 7 presents potential impacts on standardization and Section 8 concludes the white paper.