
    Evaluation Methodologies in Software Protection Research

    Man-at-the-end (MATE) attackers have full control over the system on which the attacked software runs, and they try to break the confidentiality or integrity of assets embedded in the software. Both companies and malware authors want to prevent such attacks. This has driven an arms race between attackers and defenders, resulting in a plethora of protection and analysis methods. However, it remains difficult to measure the strength of protections, because MATE attackers can reach their goals in many different ways and no universally accepted evaluation methodology exists. This survey systematically reviews the evaluation methodologies of papers on obfuscation, a major class of protections against MATE attacks. For 572 papers, we collected 113 aspects of their evaluation methodologies, ranging from sample set types and sizes, through sample treatment, to the measurements performed. We provide detailed insights into how the academic state of the art evaluates both the protections and the analyses applied to them. In summary, there is a clear need for better evaluation methodologies. We identify nine challenges for software protection evaluations, which represent threats to the validity, reproducibility, and interpretation of research results in the context of MATE attacks.
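    As a concrete illustration of the protection class this survey targets, the following minimal Python sketch shows one textbook obfuscation: an XOR-encoded string guarded by an opaque predicate. It is a generic example written for this summary, with hypothetical names, and is not drawn from any of the surveyed papers.

```python
# Illustrative only: a textbook string-encoding obfuscation guarded by an
# opaque predicate (a condition that always evaluates True but whose constancy
# is not obvious to a static analyser). Not taken from the surveyed papers.

def _opaque_true(n: int = 7) -> bool:
    # n*n*(n*n - 1) = n^2 (n-1)(n+1) is divisible by 4 for every integer n,
    # so this predicate is always True, yet it is not syntactically constant.
    return (n * n * (n * n - 1)) % 4 == 0

def get_secret() -> str:
    encoded = bytes([c ^ 0x5A for c in b"license-key"])  # XOR-"encrypted" asset
    if _opaque_true():
        return bytes(b ^ 0x5A for b in encoded).decode()
    return ""  # dead branch kept to confuse static analysis

if __name__ == "__main__":
    print(get_secret())  # prints "license-key"
```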

    Approximate Computing Survey, Part I: Terminology and Software & Hardware Approximation Techniques

    The rapid growth of demanding applications in domains applying multimedia processing and machine learning has marked a new era for edge and cloud computing. These applications involve massive data and compute-intensive tasks, and thus typical computing paradigms in embedded systems and data centers are stressed to meet the worldwide demand for high performance. Concurrently, the landscape of the semiconductor field over the last 15 years has established power as a first-class design concern. As a result, the computing-systems community is forced to find alternative design approaches that facilitate high-performance and/or power-efficient computing. Among the examined solutions, Approximate Computing has attracted ever-increasing interest, with research works applying approximations across the entire traditional computing stack, i.e., at the software, hardware, and architectural levels. Over the last decade, a plethora of approximation techniques has emerged in software (programs, frameworks, compilers, runtimes, languages), hardware (circuits, accelerators), and architectures (processors, memories). The current article is Part I of our comprehensive survey on Approximate Computing: it reviews its motivation, terminology, and principles, and it classifies and presents the technical details of state-of-the-art software and hardware approximation techniques. Comment: under review at ACM Computing Surveys.
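    As a concrete example of the kind of software-level technique such surveys classify, the sketch below shows loop perforation, a classic approximation that processes only every k-th element of a reduction and rescales the result, trading accuracy for runtime. It is a generic illustration with hypothetical function names, not code from the survey itself.

```python
# Loop perforation: a classic software-level approximation that evaluates only
# every `skip`-th element of a reduction, trading accuracy for runtime.
# Generic illustration; not code from the survey itself.

def exact_mean(xs: list[float]) -> float:
    return sum(xs) / len(xs)

def perforated_mean(xs: list[float], skip: int = 4) -> float:
    sampled = xs[::skip]            # keep 1 out of every `skip` iterations
    return sum(sampled) / len(sampled)

if __name__ == "__main__":
    data = [float(i % 97) for i in range(1_000_000)]
    approx = perforated_mean(data, skip=8)   # roughly 8x less work
    exact = exact_mean(data)
    print(f"exact={exact:.3f} approx={approx:.3f} "
          f"rel_err={abs(approx - exact) / exact:.2%}")
```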

    Technology for Low Resolution Space Based RSO Detection and Characterisation

    Space Situational Awareness (SSA) refers to all activities to detect, identify, and track objects in Earth orbit. SSA is critical to all current and future space activities and protects space assets by providing access control, conjunction warnings, and monitoring of the status of active satellites. Current SSA methods and infrastructure are not sufficient to account for the proliferation of space debris. In response to the need for better SSA, many different areas of research have sought to improve it, most of them requiring dedicated ground- or space-based infrastructure. In this thesis, a novel approach for the characterisation of RSOs (Resident Space Objects) from passive low-resolution space-based sensors is presented, along with all the background work performed to enable this novel method. Low-resolution space-based sensors are common on current satellites; because many of these sensors are already in space, using them passively to detect RSOs can greatly augment SSA without expensive infrastructure or long lead times. One of the largest hurdles for research in this area is the lack of publicly available labelled data with which to test and confirm results. To overcome this hurdle, a simulation software package, ORBITALS, was created. To verify and validate the ORBITALS simulator, it was compared against images from the Fast Auroral Imager, one of the only publicly available sources of low-resolution space-based imagery with auxiliary data. During the development of the ORBITALS simulator, it was found that generating these simulated images is computationally intensive when propagating the entire space catalog. To overcome this, the propagation method in use, Simplified General Perturbations 4 (SGP4), was upgraded to run in parallel, reducing the computational time required to propagate entire catalogs of RSOs. From the results, it was found that the standard facet model with particle swarm optimisation performed best, estimating an RSO's attitude with 0.66 degrees RMSE across a sequence and approximately 1% MAPE for the optical properties. This accomplished the thesis's goal of demonstrating the feasibility of low-resolution passive RSO characterisation from space-based platforms in a simulated environment.
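    The catalog-propagation bottleneck mentioned above can be sketched as follows: the snippet parallelises SGP4 propagation of a two-line-element (TLE) catalog across CPU cores using the open-source python-sgp4 package and the standard multiprocessing module. The file name catalog.tle, the epoch, and the worker count are placeholder assumptions, and this simplified stand-in is not the thesis's actual parallel propagator.

```python
# Minimal sketch of catalog-scale SGP4 propagation spread across CPU cores.
# Assumes the open-source `sgp4` package (pip install sgp4); an illustrative
# stand-in, not the parallel propagator developed in the thesis.
from multiprocessing import Pool

from sgp4.api import Satrec, jday

JD, FR = jday(2024, 1, 1, 12, 0, 0)   # placeholder epoch to propagate to

def propagate_one(tle: tuple[str, str]):
    """Propagate a single TLE to the target epoch; returns (error_code, r_km, v_km_s)."""
    sat = Satrec.twoline2rv(tle[0], tle[1])
    return sat.sgp4(JD, FR)

def propagate_catalog(tles: list[tuple[str, str]], workers: int = 8):
    # Each object is independent, so catalog propagation is embarrassingly
    # parallel across worker processes.
    with Pool(processes=workers) as pool:
        return pool.map(propagate_one, tles, chunksize=256)

if __name__ == "__main__":
    # "catalog.tle" is a placeholder path: a text file of alternating TLE
    # line-1/line-2 pairs obtained from a public catalog source.
    with open("catalog.tle") as f:
        lines = [ln.rstrip("\n") for ln in f if ln.strip()]
    tles = list(zip(lines[0::2], lines[1::2]))
    results = propagate_catalog(tles)
    errors = sum(1 for e, _, _ in results if e != 0)
    print(f"propagated {len(results)} objects, {errors} SGP4 error codes")
```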

    Formal description of ML models for unambiguous implementation

    Implementing deep neural networks in safety-critical systems, in particular in the aeronautical domain, will require adequate specification paradigms to preserve the semantics of the trained model on the final hardware platform. We propose to extend the NNEF language in order to allow traceable distribution and parallelisation optimisations of a trained model. We show how such a specification can be implemented in CUDA on a Xavier platform.

    Deep Photonic Networks with Arbitrary and Broadband Functionality

    Growing application space in optical communications, computing, and sensing continues to drive the need for high-performance integrated photonic components. Designing these on-chip systems with complex and application-specific functionality requires capabilities beyond what is possible with physical intuition alone, which is why machine-learning-based design methods have recently become popular. However, as the computational expense of physically accurate device simulations remains a critical challenge, these methods are typically limited in scalability and in the optical design degrees of freedom they can provide for application-specific and arbitrary photonic integrated circuits. Here, we introduce a highly scalable, physics-informed framework for the design of on-chip optical systems with arbitrary functionality, based on a deep photonic network of custom-designed Mach-Zehnder interferometers. Using this framework, we design ultra-broadband power splitters and a spectral duplexer, each in less than two minutes, and demonstrate state-of-the-art experimental performance with less than 0.66 dB insertion loss and over 120 nm of 1-dB bandwidth for all devices. Our framework provides an essential tool with a tractable path towards the systematic design of large-scale photonic systems with custom and broadband power, phase, and dispersion profiles for use in multi-band optical applications including high-throughput communications, quantum information processing, and medical and biological sensing.
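    For readers unfamiliar with the building block of such networks, the numpy sketch below assembles the 2x2 transfer matrix of a single Mach-Zehnder interferometer from ideal 50/50 couplers and phase shifters and cascades a few of them. The parameterisation is a common textbook convention assumed here; the paper's custom broadband MZI designs and physics-informed training procedure are not reproduced.

```python
# Minimal numpy sketch of a Mach-Zehnder interferometer (MZI) transfer matrix
# (two ideal 50/50 couplers around an internal phase shifter, plus an input
# phase) and of cascading such blocks. Convention is a generic assumption;
# not the paper's custom broadband devices or training procedure.
import numpy as np

BS = np.array([[1, 1j], [1j, 1]]) / np.sqrt(2)       # ideal 50/50 directional coupler

def phase(phi: float) -> np.ndarray:
    return np.diag([np.exp(1j * phi), 1.0])           # phase shifter on the top arm

def mzi(theta: float, phi: float) -> np.ndarray:
    """2x2 unitary of one MZI: input phase, coupler, internal phase, coupler."""
    return BS @ phase(theta) @ BS @ phase(phi)

def cascade(params: list[tuple[float, float]]) -> np.ndarray:
    """Transfer matrix of several MZIs applied in sequence on the same two ports."""
    U = np.eye(2, dtype=complex)
    for theta, phi in params:
        U = mzi(theta, phi) @ U
    return U

if __name__ == "__main__":
    U = cascade([(0.3, 1.1), (np.pi / 2, 0.0), (2.0, 0.7)])
    assert np.allclose(U.conj().T @ U, np.eye(2))      # lossless => unitary
    out = U @ np.array([1.0, 0.0])                     # light injected into port 0
    print(np.abs(out) ** 2)                            # output powers sum to 1
```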

    Scaling up integrated photonic reservoirs towards low-power high-bandwidth computing


    The Metaverse: Survey, Trends, Novel Pipeline Ecosystem & Future Directions

    The Metaverse offers a second world beyond reality, where boundaries are non-existent and possibilities are endless, through engagement and immersive experiences using virtual reality (VR) technology. Many disciplines can benefit from the advancement of the Metaverse when it is properly developed, including technology, gaming, education, art, and culture. Nevertheless, developing the Metaverse environment to its full potential is an ambiguous task that needs proper guidance and direction. Existing surveys on the Metaverse focus only on a specific aspect or discipline and lack a holistic view of the entire process. A more holistic, multi-disciplinary, in-depth, academically and industry-oriented review is therefore required to provide a thorough study of the Metaverse development pipeline. To address these issues, we present in this survey a novel multi-layered pipeline ecosystem composed of (1) the Metaverse computing, networking, communications, and hardware infrastructure, (2) environment digitization, and (3) user interactions. For every layer, we discuss the components that detail the steps of its development, and for each of these components we examine the impact of a set of enabling technologies and empowering domains (e.g., Artificial Intelligence, Security & Privacy, Blockchain, Business, Ethics, and Social) on its advancement. In addition, we explain the importance of these technologies in supporting decentralization, interoperability, user experiences, interactions, and monetization. Our study highlights the existing challenges for each component, followed by research directions and potential solutions. To the best of our knowledge, this survey is the most comprehensive to date and allows users, scholars, and entrepreneurs to gain an in-depth understanding of the Metaverse ecosystem and to identify their opportunities for contribution.

    INVESTIGATING THE PERCEPTION OF EXPATRIATES TOWARDS IMMIGRATION SERVICE QUALITY IN SHARJAH, UNITED ARAB EMIRATES THROUGH MIXED METHOD APPROACH

    The public sector in the UAE is under immense pressure to demonstrate that its services are customer-focused and that continuous performance improvement is being delivered. The United Arab Emirates is a favoured destination for expatriates because its own citizens form a minority of the population and are barely represented in the private-sector workforce. These highly unusual demographics confer high importance on the national immigration services. Recently, increased interest in international migration, specifically within the United Arab Emirates, has been shown both by government agencies and by the governments of industrialised countries. Given the importance of the expatriate labour force to economic stability and growth in the Emirates, this research investigates how immigration services are perceived, with the aim of contributing to their improvement and thus ultimately supporting economic growth. It proposes a service quality perception framework to improve understanding within SID of how to raise the level of service delivered to migrants and other persons directly or indirectly affected by SID services. Qualitative data were collected by means of semi-structured interviews, and quantitative data by means of a questionnaire survey based on the abovementioned framework. The survey data, covering the variables influencing participants' experiences and perceptions of SID services, were subjected to statistical analysis. The framework was then used to evaluate quality of service in terms of general impressions, delivery, location, response, SID culture, and behaviour. Numerical data were analysed using descriptive and inferential statistics. It was found that service quality positively influenced service behaviour and that this relationship was mediated by SID culture. This research makes an original contribution to knowledge as one of the few studies of immigration to the United Arab Emirates. By examining the workings of one immigration department, it adds to the literature on immigration departments and organisational development in developing countries. It illuminates the mechanics of immigration services and demonstrates their increasing importance to the world economy.
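    The mediation finding described above can be illustrated with a small sketch: the regressions below follow the classic Baron-Kenny steps on synthetic data using statsmodels. The variable names (service_quality, sid_culture, service_behaviour) and the data are hypothetical stand-ins chosen for illustration, not the thesis's instrument or dataset.

```python
# Illustrative Baron-Kenny-style mediation check on synthetic data:
# service_quality -> sid_culture (mediator) -> service_behaviour.
# Names and data are hypothetical stand-ins, not the thesis's dataset.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
service_quality = rng.normal(size=n)
sid_culture = 0.6 * service_quality + rng.normal(scale=0.8, size=n)           # a-path
service_behaviour = (0.5 * sid_culture + 0.1 * service_quality
                     + rng.normal(scale=0.8, size=n))                          # b-path + direct

def fit(y, regressors):
    """OLS of y on an intercept plus the given regressors."""
    return sm.OLS(y, sm.add_constant(np.column_stack(regressors))).fit()

total = fit(service_behaviour, [service_quality])                 # c: total effect
a_path = fit(sid_culture, [service_quality])                      # a: IV -> mediator
full = fit(service_behaviour, [service_quality, sid_culture])     # c' and b

print("total effect c  :", round(total.params[1], 3))
print("a path          :", round(a_path.params[1], 3))
print("direct effect c':", round(full.params[1], 3))
print("b path          :", round(full.params[2], 3))
# Mediation is suggested when a and b are significant and |c'| < |c|.
```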

    Review of Methodologies to Assess Bridge Safety During and After Floods

    This report summarizes a review of technologies used to monitor bridge scour, with an emphasis on techniques appropriate for testing during and immediately after design flood conditions. The goal of this study is to identify potential technologies and strategies for the Illinois Department of Transportation that may be used to enhance the reliability of bridge safety monitoring during floods at the local to state levels. The research team conducted a literature review of technologies that have been explored by state departments of transportation (DOTs) and national agencies, as well as state-of-the-art technologies that have not been extensively employed by DOTs. This review included informational interviews with representatives from DOTs and relevant industry organizations. Recommendations include considering (1) acquisition of tethered kneeboard- or surf-ski-mounted single-beam sonars for rapid deployment by local agencies, (2) acquisition of remote-controlled vessels mounted with single-beam and side-scan sonars for statewide deployment, (3) development of large-scale particle image velocimetry systems using remote-controlled drones for stream velocity and direction measurement during floods (see the sketch below), (4) physical modeling to develop Illinois-specific hydrodynamic loading coefficients for Illinois bridges during flood conditions, and (5) development of holistic risk-based bridge assessment tools that incorporate structural, geotechnical, hydraulic, and scour measurements to provide rapid feedback for bridge closure decisions. (IDOT-R27-SP50)
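    Recommendation (3) refers to large-scale particle image velocimetry (LSPIV); as a rough illustration of its core step, the numpy-only sketch below estimates the displacement of a textured water-surface patch between two frames by FFT-based cross-correlation and converts it to a velocity. The window size, frame interval, and pixel-to-metre scale are placeholder assumptions, not values from the report.

```python
# Rough sketch of the core LSPIV step: estimate how far a textured water-surface
# patch moved between two frames via FFT cross-correlation, then convert the
# pixel displacement to a surface velocity. Scale and timing are placeholders.
import numpy as np

def patch_displacement(win_a: np.ndarray, win_b: np.ndarray) -> tuple[int, int]:
    """Peak of the circular cross-correlation between two interrogation windows."""
    a = win_a - win_a.mean()
    b = win_b - win_b.mean()
    corr = np.fft.ifft2(np.fft.fft2(a).conj() * np.fft.fft2(b)).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Wrap displacements larger than half the window into negative shifts.
    if dy > win_a.shape[0] // 2: dy -= win_a.shape[0]
    if dx > win_a.shape[1] // 2: dx -= win_a.shape[1]
    return dy, dx

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    frame_a = rng.random((64, 64))
    frame_b = np.roll(frame_a, shift=(3, -2), axis=(0, 1))   # known shift: 3 px down, 2 px left
    dy, dx = patch_displacement(frame_a, frame_b)
    metres_per_pixel, dt = 0.05, 1 / 30                      # placeholder scale & frame interval
    vy, vx = dy * metres_per_pixel / dt, dx * metres_per_pixel / dt
    print(f"displacement=({dy},{dx}) px  velocity=({vy:.2f},{vx:.2f}) m/s")
```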