
    A systematic literature review on source code similarity measurement and clone detection: techniques, applications, and challenges

    Measuring and evaluating source code similarity is a fundamental software engineering activity that embraces a broad range of applications, including but not limited to code recommendation, duplicate-code, plagiarism, malware, and smell detection. This paper presents a systematic literature review and meta-analysis of code similarity measurement and evaluation techniques to shed light on the existing approaches and their characteristics in different applications. We initially found over 10,000 articles by querying four digital libraries and ended up with 136 primary studies in the field. The studies were classified according to their methodology, programming languages, datasets, tools, and applications. A deep investigation reveals 80 software tools, working with eight different techniques across five application domains. Nearly 49% of the tools work on Java programs and 37% support C and C++, while many programming languages have no tool support at all. A noteworthy finding was the existence of 12 datasets related to source code similarity measurement and duplicate code, of which only eight are publicly accessible. The lack of reliable datasets, empirical evaluations, hybrid methods, and focus on multi-paradigm languages are the main challenges in the field. Emerging applications of code similarity measurement concentrate on the development phase in addition to maintenance. (Comment: 49 pages, 10 figures, 6 tables)

    Approximate Computing Survey, Part I: Terminology and Software & Hardware Approximation Techniques

    The rapid growth of demanding applications in domains applying multimedia processing and machine learning has marked a new era for edge and cloud computing. These applications involve massive data and compute-intensive tasks, and thus, typical computing paradigms in embedded systems and data centers are stressed to meet the worldwide demand for high performance. Concurrently, the landscape of the semiconductor field over the last 15 years has established power as a first-class design concern. As a result, the computing-systems community has been forced to find alternative design approaches that facilitate high-performance and/or power-efficient computing. Among the examined solutions, Approximate Computing has attracted ever-increasing interest, with research works applying approximations across the entire traditional computing stack, i.e., at the software, hardware, and architectural levels. Over the last decade, a plethora of approximation techniques has emerged in software (programs, frameworks, compilers, runtimes, languages), hardware (circuits, accelerators), and architectures (processors, memories). The current article is Part I of our comprehensive survey on Approximate Computing: it reviews its motivation, terminology, and principles, and classifies and presents the technical details of the state-of-the-art software and hardware approximation techniques. (Comment: Under review at ACM Computing Surveys)

    Migrating Integration from SOAP to REST: Can the Advantages of Migration Justify the Project?

    This thesis investigates the functional and conceptual differences between SOAP-based and RESTful web services and their implications in the context of a real-world migration project. The primary research questions addressed are:
    • What are the key functional and conceptual differences between SOAP-based and RESTful web services?
    • How can SOAP-based and RESTful service clients be combined into a general client?
    • Can developing a client that works with both REST and SOAP be justified by differences in performance and maintainability?
    The thesis begins with a literature review of the core principles and features of SOAP and REST, highlighting their strengths, weaknesses, and suitability for different use cases. A detailed comparison table summarizes the key differences between the two kinds of web service. The thesis then presents a case study of a migration project from Lemonsoft's web team, which involved adapting an existing integration to support both SOAP-based and RESTful services. The project used design patterns and a general client implementation to achieve a unified solution compatible with both protocols. In terms of performance, the evaluation showed that the general client led to faster execution times and reduced memory usage, enhancing overall system efficiency. Maintainability was improved by simplifying the codebase, using design patterns and object factories, adopting an interface-driven design, and promoting collaborative code reviews. These enhancements not only resulted in a better user experience but also minimized future resource demands and maintenance costs. In conclusion, this thesis provides insights into the functional and conceptual differences between SOAP-based and RESTful web services, the challenges and best practices for implementing a general client, and the justification for such a solution based on performance and maintainability improvements.
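    The "general client" idea described in this abstract — one interface hiding whether the transport is SOAP or REST, with an object factory choosing the implementation — can be sketched minimally. All names below (IntegrationClient, SoapClient, RestClient, client_for) are illustrative assumptions, not identifiers from the thesis, and the request-building is a toy stand-in for real transport code.

```python
import json
from abc import ABC, abstractmethod


class IntegrationClient(ABC):
    """Common interface: callers never see the underlying protocol."""

    @abstractmethod
    def build_request(self, operation: str, payload: dict) -> str: ...


class SoapClient(IntegrationClient):
    def build_request(self, operation, payload):
        # Wrap the payload in a (simplified) SOAP envelope.
        body = "".join(f"<{k}>{v}</{k}>" for k, v in payload.items())
        return (f"<soap:Envelope><soap:Body><{operation}>{body}"
                f"</{operation}></soap:Body></soap:Envelope>")


class RestClient(IntegrationClient):
    def build_request(self, operation, payload):
        # A REST call maps the operation to a resource path and JSON body.
        return f"POST /{operation} {json.dumps(payload)}"


def client_for(protocol: str) -> IntegrationClient:
    """Object factory: selects the concrete client at runtime."""
    return {"soap": SoapClient, "rest": RestClient}[protocol]()
```

Because call sites depend only on the interface, switching an integration partner from SOAP to REST requires no changes outside the factory, which is the maintainability gain the thesis attributes to this design.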

    Unified System on Chip RESTAPI Service (USOCRS)

    Abstract. This thesis investigates the development of a Unified System on Chip RESTAPI Service (USOCRS) to enhance the efficiency and effectiveness of SoC verification reporting. The research aims to overcome the challenges associated with the transfer, utilization, and interpretation of SoC verification reports by creating a unified platform that integrates various tools and technologies. The research methodology follows a design science approach. A thorough literature review was conducted to explore existing approaches and technologies related to SoC verification reporting, automation, data visualization, and API development. The review revealed gaps in the current state of the field, providing a basis for further investigation. Using the insights gained from the literature review, a system design and implementation plan was developed. This plan makes use of technologies such as FastAPI, SQL and NoSQL databases, Azure Active Directory for authentication, and cloud services. The Verification Toolbox was employed to validate SoC reports against the organization's standards. The system underwent manual testing, and user satisfaction was evaluated to ensure its functionality and usability. The results of this study demonstrate the successful design and implementation of the USOCRS, offering SoC engineers a unified and secure platform for uploading, validating, storing, and retrieving verification reports. The USOCRS facilitates seamless communication between users and the API, granting easy access to vital information, including successes, failures, and test coverage, derived from submitted SoC verification reports. By automating and standardizing the SoC verification reporting process, the USOCRS eliminates the manual and repetitive tasks usually done by developers, thereby enhancing productivity and establishing a robust and reliable framework for report storage and retrieval.
    Through the integration of diverse tools and technologies, the USOCRS presents a comprehensive solution that adheres to the required specifications of the SoC schema used within the organization. Furthermore, the USOCRS significantly improves the efficiency and effectiveness of SoC verification reporting: it streamlines the submission process, reduces latency through optimized data storage, and enables meaningful extraction and analysis of report data.
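    The upload–validate–store–retrieve cycle at the heart of such a service can be sketched independently of the web framework. The field names and the schema below are illustrative assumptions, not the organization's actual SoC schema, and the in-memory store stands in for the SQL/NoSQL back end.

```python
# Assumed (hypothetical) report schema: not the organization's real SoC schema.
REQUIRED_FIELDS = {"design", "testbench", "passes", "failures", "coverage"}


class ReportStore:
    """In-memory stand-in for the service's database back end."""

    def __init__(self):
        self._reports = {}

    def validate(self, report: dict) -> list:
        """Return a list of schema violations; an empty list means valid."""
        errors = [f"missing field: {f}" for f in REQUIRED_FIELDS - report.keys()]
        if not errors and not 0.0 <= report["coverage"] <= 100.0:
            errors.append("coverage must be a percentage")
        return errors

    def upload(self, report_id: str, report: dict):
        """Reject invalid reports at the door, as the service's validation step does."""
        errors = self.validate(report)
        if errors:
            raise ValueError("; ".join(errors))
        self._reports[report_id] = report

    def summary(self, report_id: str) -> dict:
        """Expose the vital information: successes, failures, and coverage."""
        r = self._reports[report_id]
        return {"passes": r["passes"], "failures": r["failures"],
                "coverage": r["coverage"]}
```

In the real service this logic would sit behind FastAPI path operations, with the store backed by the databases named in the abstract.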

    2023-2024 Boise State University Undergraduate Catalog

    This catalog is primarily for and directed at students. However, it serves many audiences, such as high school counselors, academic advisors, and the public. In this catalog you will find an overview of Boise State University and information on admission, registration, grades, tuition and fees, financial aid, housing, student services, and other important policies and procedures. However, most of this catalog is devoted to describing the various programs and courses offered at Boise State.

    Impact of language skills and system experience on medical information retrieval


    One-sided differentiability: a challenge for computer algebra systems

    Computer Algebra Systems (CASs) are extremely powerful and widely used digital tools. Focusing on differentiation, CASs include a command that computes the derivative of functions of one variable (and also the partial derivatives of functions of several variables). In this article we focus on real-valued functions of one real variable. Since CASs usually compute the derivative of a real-valued function as a whole, the value of the computed derivative at points where the left derivative and the right derivative differ (which we call conflicting points) should be something like "undefined"; this is not always the case, however: the output can differ strongly depending on the chosen CAS. We analyse and compare how some well-known CASs behave when addressing differentiation at the conflicting points of five different functions chosen by the authors. Finally, the ability of CASs to calculate one-sided limits makes it possible to compute the result directly in these cumbersome cases using the formal definition of the one-sided derivative, which we have also analysed and compared for the selected CASs. This is an important issue for teaching, since the topic belongs to secondary education and CASs are nowadays commonly used as auxiliary digital tools for teaching mathematics.
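    The formal definition invoked above — a one-sided limit of the difference quotient — can be illustrated numerically (a CAS would do this symbolically). A minimal sketch for f(x) = |x|, whose conflicting point is x = 0; the function name and step size are choices for illustration:

```python
def one_sided_derivative(f, x0, side, h=1e-8):
    """One-sided difference quotient: side=+1 for the right derivative,
    side=-1 for the left derivative, approximating the one-sided limit."""
    h = side * abs(h)
    return (f(x0 + h) - f(x0)) / h


f = abs  # |x| has a conflicting point at x = 0
right = one_sided_derivative(f, 0.0, +1)
left = one_sided_derivative(f, 0.0, -1)
print(left, right)  # → -1.0 1.0
```

At the conflicting point the two quotients disagree, which is exactly why a blanket derivative command should report the two-sided derivative as undefined there.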

    Interdisciplinarity as a political instrument of governance and its consequences for doctoral training

    UK educational policies exploit interdisciplinarity as a marketing tool in a competitive educational world by building images of prosperous futures for society, the economy, and universities. Following this narrative, interdisciplinary science is promoted as superior to disciplinary forms of research and is taken to require the training of future researchers accordingly, with interdisciplinary doctoral education becoming more established in universities. This emphasis on the growth of interdisciplinary science polarises scholars' views on the role of academic research, ranging from the production of knowledge at one end of the spectrum to knowledge as an economic resource at the other. This research asks: what is the rationale behind the perceived value of interdisciplinary research and training, and how does it affect graduate students' experiences of their PhD? Based on a practice theory perspective, chosen for its suitability in generating insights into how universities' social life is organised, reproduced and transformed, the doctorate is conceptualised as sets of interconnected practices that are observable as they happen. The study therefore comprised two stages of data collection and analysis: the examination of documents to elucidate educational policy practices, and an educational ethnography of an interdisciplinary doctoral programme. This study found that interdisciplinary doctoral training is hindered by the lack of role models and positive social relationships, which are crucial to the way interdisciplinary students learn. Furthermore, it is argued that interdisciplinarity is sometimes applied to research as a label to fit funders' requirements. Specifically, in this case, medical optical imaging is best seen as an interdiscipline, as it does not exhibit true interdisciplinary integration.
    Further insights show that while interdisciplinarity is promoted in policy through promises and expectations of a better future, this is in tension with how it is organisationally embedded in higher education. These insights form the basis for a list of practical recommendations for institutions. Overall, interdisciplinary doctoral training was observed to present students with difficulties and to leave policy concerns unaddressed.

    A prototype of the data quality pipeline of the Online Observation Quality System of ASTRI-Mini Array telescope system

    Gamma-ray astronomy investigates the physics of the universe and the characteristics of celestial objects through gamma rays, the most energetic part of the electromagnetic spectrum, emitted in some of the brightest events in the universe, such as pulsars, quasars, and supernova remnants. Gamma rays can be observed with satellites or ground-based telescopes. The latter detect gamma rays in the very-high-energy range with the indirect Cherenkov technique: when highly energetic photons enter Earth's atmosphere, they generate air showers, cascades of particles whose fast motion produces elusive flashes of blue Cherenkov light in the sky. This thesis discusses research conducted at the Astrophysics and Space Science Observatory of Bologna in collaboration with ASTRI Mini-Array, the international project, led by INAF, for ground-based gamma-ray astrophysics. The focus is on the Online Observation Quality System (OOQS), which conducts a quick-look analysis during telescope observations. The Cherenkov Camera Data Quality Checker is the OOQS component that performs real-time quality checks on the data acquired from the nine Cherenkov cameras at high frequency, up to 1,000 Hz, with a total bandwidth of 148 MB/s. The thesis presents the implementation of the OOQS-Pipeline, a software prototype that receives scientific packets from a Cherenkov camera, performs quality analysis, and stores the results. The pipeline consists of three main applications: Kafka-Consumer, DQ-Analysis, and DQ-Aggregator. The pipeline was tested on a server with performance similar to that of the machines at the Array Observing Site, and the results indicate that it can acquire the maximum data flow produced by the cameras. Overall, the thesis presents an important contribution to the ASTRI Mini-Array project: the development of the first version of the OOQS-Pipeline, which will maximize observation time with quality data passing the verification thresholds.
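    One flavour of the real-time checks such a pipeline performs can be sketched: a stage that tracks the packet rate of a camera stream over a sliding window and flags departures from a nominal band. The class name, field names, and thresholds are illustrative assumptions, not ASTRI's actual checks or limits.

```python
from collections import deque


class RateCheck:
    """Flag a camera stream whose packet rate drifts outside a nominal band
    (a hypothetical sliding-window data-quality check)."""

    def __init__(self, window_s=1.0, min_hz=10.0, max_hz=1000.0):
        self.window_s, self.min_hz, self.max_hz = window_s, min_hz, max_hz
        self._timestamps = deque()

    def ingest(self, t: float) -> str:
        """Record one packet arrival time (seconds) and return a quality flag."""
        self._timestamps.append(t)
        # Drop arrivals that have slid out of the window.
        while self._timestamps and t - self._timestamps[0] > self.window_s:
            self._timestamps.popleft()
        rate = len(self._timestamps) / self.window_s
        if rate < self.min_hz:
            return "LOW_RATE"
        if rate > self.max_hz:
            return "HIGH_RATE"
        return "OK"
```

In a pipeline like the one described, a consumer stage would feed packet timestamps into checks of this kind and an aggregator would persist the resulting flags.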

    Late-bound code generation

    Each time a function or method is invoked during the execution of a program, a stream of instructions is issued to some underlying hardware platform. But exactly which hardware, and which instructions, is usually left implicit. However, in certain situations it becomes important to control these decisions. For example, particular problems can only be solved in real time when scheduled on specialised accelerators, such as graphics coprocessors or computing clusters. We introduce a novel operator for hygienically reifying the behaviour of a runtime function instance as a syntactic fragment, in a language which may in general differ from that of the source function definition. Translation and optimisation are performed by recursively invoked, dynamically dispatched code generators. Side-effecting operations are permitted, and their ordering is preserved. We compare our operator with other techniques for pragmatic control, observing that its use supports lifting arbitrary mutable objects, and neither requires rewriting sections of the source program in a multi-level language nor interferes with the interface to individual software components. Due to its lack of interference at the abstraction level at which software is composed, we believe that our approach poses a significantly lower barrier to practical adoption than current methods. The practical efficacy of our operator is demonstrated by using it to offload the user-interface rendering of a smartphone application to an FPGA coprocessor, including both statically and procedurally defined user-interface components. The generated pipeline is an application-specific, statically scheduled processor-per-primitive rendering pipeline, suitable for place-and-route-style optimisation. To demonstrate the compatibility of our operator with existing languages, we show how it may be defined within the Python programming language.
    We introduce a transformation for weakening mutable to immutable named bindings, termed let-weakening, to solve the problem of propagating information pertaining to named variables between modular code-generating units.
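    The general flavour of the approach — capture a function definition as a syntactic fragment, transform it, and re-materialise it as executable code — can be hinted at in plain Python with the standard `ast` module. This is only a loose illustration under stated assumptions: the thesis's operator reifies runtime function *instances* hygienically and can target a different language, neither of which this sketch attempts; the source string and the renaming step are arbitrary examples.

```python
import ast

# A source fragment standing in for a captured function definition.
src = "def blur(x, y):\n    return (x + y) / 2\n"

fragment = ast.parse(src).body[0]   # reify: source text -> syntax tree
fragment.name = "blur_generated"    # transform the fragment (here: rename it)

module = ast.Module(body=[fragment], type_ignores=[])
ast.fix_missing_locations(module)

ns = {}
exec(compile(module, "<generated>", "exec"), ns)  # re-materialise as code
print(ns["blur_generated"](2, 4))  # → 3.0
```

A code generator in the thesis's sense would recurse over such fragments, dispatching on their structure to emit (and optimise) code for the chosen target platform.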