
    DeepSoft: A vision for a deep model of software

    Although software analytics has experienced rapid growth as a research area, it has not yet reached its full potential for wide industrial adoption. Most existing work in software analytics still relies heavily on costly manual feature engineering and mainly addresses traditional classification problems rather than predicting future events. We present a vision for \emph{DeepSoft}, an \emph{end-to-end} generic framework for modeling software and its development process to predict future risks and recommend interventions. DeepSoft, partly inspired by human memory, is built upon the powerful deep learning-based Long Short-Term Memory (LSTM) architecture, which is capable of learning the long-term temporal dependencies that occur in software evolution. Such deeply learned patterns of software can be used to address a range of challenging problems such as code and task recommendation and prediction. DeepSoft provides a new approach for research into the modeling of source code, risk prediction and mitigation, developer modeling, and automatically generating code patches from bug reports. Comment: FSE 201
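    The LSTM architecture this abstract refers to maintains a gated cell state that lets it retain information across long event sequences. As an illustration only (not DeepSoft's actual implementation, and with toy weights rather than learned ones), a single-unit LSTM step can be sketched in plain Python:

    ```python
    import math

    def sigmoid(x):
        return 1.0 / (1.0 + math.exp(-x))

    def lstm_step(x, h_prev, c_prev, W):
        # One LSTM cell step: gates are computed from the current input x
        # and the previous hidden state h_prev.
        i = sigmoid(W["wi"] * x + W["ui"] * h_prev + W["bi"])    # input gate
        f = sigmoid(W["wf"] * x + W["uf"] * h_prev + W["bf"])    # forget gate
        o = sigmoid(W["wo"] * x + W["uo"] * h_prev + W["bo"])    # output gate
        g = math.tanh(W["wg"] * x + W["ug"] * h_prev + W["bg"])  # candidate cell value
        c = f * c_prev + i * g        # cell state: carries long-term memory
        h = o * math.tanh(c)          # hidden state: short-term output
        return h, c

    # Toy constant weights; in a real model these are learned from data.
    W = {k: 0.5 for k in ("wi", "ui", "bi", "wf", "uf", "bf",
                          "wo", "uo", "bo", "wg", "ug", "bg")}

    h, c = 0.0, 0.0
    # A hypothetical sequence of encoded software-development events.
    for event in [1.0, 0.0, 0.0, 1.0]:
        h, c = lstm_step(event, h, c, W)
    ```

    The forget gate `f` decides how much of the previous cell state survives each step, which is what allows dependencies spanning many events to be modeled.
    
    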

    Special Session on Industry 4.0

    No abstract available

    Intelligent Management and Efficient Operation of Big Data

    This chapter details how Big Data can be used and implemented in networking and computing infrastructures. Specifically, it addresses three main aspects: the timely extraction of relevant knowledge from heterogeneous and often unstructured large data sources; enhancing the performance of the processing and networking (cloud) infrastructures that are the most important foundational pillars of Big Data applications and services; and novel ways to efficiently manage network infrastructures with high-level composed policies for supporting the transmission of large amounts of data with distinct requirements (video vs. non-video). A case study involving an intelligent management solution to route data traffic with diverse requirements in a wide-area Internet Exchange Point is presented, discussed in the context of Big Data, and evaluated. Comment: In book Handbook of Research on Trends and Future Directions in Big Data and Web Intelligence, IGI Global, 201

    Developing the scales on evaluation beliefs of student teachers

    The purpose of the study reported in this paper was to investigate the validity and reliability of a newly developed questionnaire named ‘Teacher Evaluation Beliefs’ (TEB). The framework for developing items was provided by two models. The first model focuses on Student-Centered and Teacher-Centered beliefs about evaluation, while the other centers on five dimensions (what/who/when/why/how). The validity and reliability of the new instrument were investigated using both exploratory and confirmatory factor analysis (n=446). Overall, the results indicate that the two-factor structure is more reasonable than the five-factor one. Further research needs additional items addressing the latent dimensions “what”, “who”, “when”, “why”, and “how” for each existing factor based on the Student-Centered and Teacher-Centered approaches.

    Digital technology and governance in transition: The case of the British Library

    Comment on the organizational consequences of the new information and communications technologies (ICTs) is pervaded by a powerful imagery of disaggregation and a tendency for ‘virtual’ forms of production to be seen as synonymous with the ‘end’ of bureaucracy. This paper questions the underlying assumptions of the ‘virtual organization’, highlighting the historically enduring, diversified character of the bureaucratic form. The paper then presents case study findings on the web-based access to information resources now being provided by the British Library (BL). The case study evidence produces two main findings. First, radically decentralised virtual forms of service delivery are heavily dependent on new forms of capacity-building and information aggregation. Second, digital technology is embedded in an inherently contested and contradictory context of institutional change. Current developments in the management and control of digital rights are consistent with the commodification of the public sphere. However, the evidence also suggests that scholarly access to information resources is being significantly influenced by the ‘information society’ objectives of the BL and other institutional players within the network of UK research libraries.

    IEEE Access special section editorial: Artificial intelligence enabled networking

    With today’s computer networks becoming increasingly dynamic, heterogeneous, and complex, there is great interest in deploying artificial intelligence (AI) based techniques for the optimization and management of computer networks. AI techniques, which subsume multidisciplinary methods from machine learning, optimization theory, game theory, control theory, and meta-heuristics, have long been applied to optimize computer networks in many diverse settings. Such an approach is gaining increased traction with the emergence of novel networking paradigms that promise to simplify network management (e.g., cloud computing, network functions virtualization, and software-defined networking) and provide intelligent services (e.g., future 5G mobile networks). Looking ahead, greater integration of AI into networking architectures can help develop a future vision of cognitive networks that will exhibit network-wide intelligent behavior to solve problems of network heterogeneity, performance, and quality of service (QoS).
