    PRONOM-ROAR: Adding Format Profiles to a Repository Registry to Inform Preservation Services

    To date, many institutional repository (IR) software suppliers have pushed the IR as a digital preservation solution. We argue that the digital preservation of objects in IRs may be better achieved through the use of lightweight, add-on services. We present such a service, PRONOM-ROAR, which generates file format profiles for IRs. This demonstrates the potential of using third-party services to provide preservation expertise to IR managers by making use of existing machine interfaces to IRs.
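
    A minimal sketch, in Python, of the general pattern described above (not the PRONOM-ROAR implementation): a third-party service that uses a repository's existing machine interface, here assumed to be an OAI-PMH endpoint exposing Dublin Core records, to build a crude file format profile. The endpoint URL is a placeholder, and real format identification would resolve formats against the PRONOM registry rather than trusting dc:format values.

    # Hedged sketch: harvest an IR's OAI-PMH feed and tally dc:format values
    # into a simple format profile. PRONOM-ROAR itself resolves formats against
    # the PRONOM registry; this only illustrates the add-on-service pattern.
    from collections import Counter
    from urllib.request import urlopen
    import xml.etree.ElementTree as ET

    DC = "{http://purl.org/dc/elements/1.1/}"

    def format_profile(base_url: str) -> Counter:
        url = f"{base_url}?verb=ListRecords&metadataPrefix=oai_dc"
        with urlopen(url) as resp:            # fetch one page of records
            tree = ET.parse(resp)
        profile = Counter()
        for fmt in tree.iter(f"{DC}format"):  # dc:format often holds a MIME type
            if fmt.text:
                profile[fmt.text.strip()] += 1
        return profile

    # Example against a hypothetical repository endpoint:
    # print(format_profile("https://repository.example.org/oai").most_common(10))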

    RGG: A general GUI Framework for R scripts

    Background: R is the leading open source statistics software, with a vast number of biostatistical and bioinformatics analysis packages. To exploit the advantages of R, extensive scripting/programming skills are required. Results: We have developed a software tool called R GUI Generator (RGG) which enables the easy generation of Graphical User Interfaces (GUIs) for the programming language R by adding a few Extensible Markup Language (XML) tags. RGG consists of an XML-based GUI definition language and a Java-based GUI engine. GUIs are generated at runtime from GUI tags embedded in the R script. User input from the GUI is returned to the R code and replaces the XML tags. RGG files can be developed using any text editor. The current version of RGG is available as stand-alone software (RGGRunner) and as a plug-in for JGR. Conclusion: RGG is a general GUI framework for R that has the potential to introduce R statistics (R packages, built-in functions and scripts) to users with limited programming skills and helps to bridge the gap between R developers and GUI-dependent users. RGG aims to abstract GUI development from individual GUI toolkits by using an XML-based GUI definition language; thus RGG can be easily integrated into any software. The RGG project further includes the development of a web-based repository for RGG GUIs. RGG is an open source project licensed under the Lesser General Public License (LGPL) and can be downloaded freely at http://rgg.r-forge.r-project.org.
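
    As an illustration of the general mechanism described above, the Python sketch below mimics how GUI tags embedded in a script can be replaced by user-supplied values; the tag syntax here is invented for illustration and is not the actual RGG XML definition language.

    # Illustrative sketch only: extract embedded GUI tags from a script and
    # substitute user input back into the code, as RGG does for R scripts.
    import re

    script = 'data <- read.csv(<filechooser var="infile"/>)\nsummary(data)'

    def render(script: str, answers: dict) -> str:
        # Replace each <... var="name"/> tag with the value the user entered.
        def substitute(match: re.Match) -> str:
            return repr(answers[match.group(1)])  # quote the value for the generated code
        return re.sub(r'<\w+ var="(\w+)"/>', substitute, script)

    print(render(script, {"infile": "counts.csv"}))
    # -> data <- read.csv('counts.csv') ...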

    Test Set Diameter: Quantifying the Diversity of Sets of Test Cases

    A common and natural intuition among software testers is that test cases need to differ if a software system is to be tested properly and its quality ensured. Consequently, much research has gone into formulating distance measures for how test cases, their inputs and/or their outputs differ. However, common to these proposals is that they are data-type specific and/or calculate the diversity only between pairs of test inputs, traces or outputs. We propose a new metric to measure the diversity of sets of tests: the test set diameter (TSDm). It extends our earlier, pairwise test diversity metrics based on recent advances in information theory regarding the calculation of the normalized compression distance (NCD) for multisets. An advantage is that TSDm can be applied regardless of data type and to any test-related information, not only the test inputs. A downside is the increased computational time compared to competing approaches. Our experiments on four different systems show that the test set diameter can help select test sets with higher structural and fault coverage than random selection, even when applied only to test inputs. This can enable early test design and selection, prior to even having a software system to test, and complement other types of test automation and analysis. We argue that this quantification of test set diversity creates a number of opportunities to better understand software quality and provides practical ways to increase it.
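
    A minimal sketch of the kind of computation involved, using one published multiset variant of the normalized compression distance (often written NCD1) with zlib as the compressor; the paper's exact formulation, compressor choice and normalization may differ.

    # Hedged sketch: multiset NCD over a set of test inputs, with zlib as the
    # compressor. Larger values suggest a more diverse test set.
    import zlib

    def C(data: bytes) -> int:
        """Compressed size of a byte string."""
        return len(zlib.compress(data, 9))

    def ncd_multiset(tests: list[bytes]) -> float:
        whole = C(b"".join(tests))
        singles = [C(t) for t in tests]
        leave_one_out = [C(b"".join(tests[:i] + tests[i + 1:])) for i in range(len(tests))]
        return (whole - min(singles)) / max(leave_one_out)

    # Near-duplicate inputs should typically score lower than distinct ones:
    print(ncd_multiset([b"login admin", b"buy item 42", b"reset password"]))
    print(ncd_multiset([b"login admin", b"login admin", b"login admin!"]))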

    Forensic Box for Quick Network-Based Security Assessments

    Network security assessments are seen as important, yet cumbersome and time-consuming tasks, mostly due to the use of different and manually operated tools. These are often very specialized tools that need to be mastered and combined, and they sometimes require setting up a testing environment. Nonetheless, in many cases it would be useful to obtain an audit in a swift and on-demand manner, even if with less detail. In such cases, these audits could be used as an initial step for a more detailed evaluation of network security, as a complement to other audits, or as an aid in preventing major data leaks and system failures due to common configuration, management or implementation issues. This dissertation describes the work towards the design and development of a portable system for quick network security assessments, and the research on the automation of many of the tasks (and associated tools) composing that process. An embodiment of such a system was built using a Raspberry Pi 2; several well-known open source tools whose functions range from network discovery, service identification and Operating System (OS) fingerprinting to network sniffing and vulnerability discovery; and custom scripts and programs that connect all the different parts comprising the system. The tools are integrated seamlessly with the system, which allows deployment in wired or wireless network environments, where the device carries out a mostly automated and thorough analysis. The device is near plug-and-play and produces a structured report at the end of the assessment. Several simple functions, such as re-scanning the network or performing Address Resolution Protocol (ARP) poisoning on the network, are readily available through a small LCD display mounted on top of the device. It also offers a web-based interface, likewise developed within the scope of this work, for finer configuration of the several tools and for viewing the report. Other specific outputs, such as PCAP files with collected traffic, are available for further analysis. The system was operated in controlled and real networks in order to verify the quality of its assessments. The obtained results were compared with those obtained by manually auditing the same networks, and they showed that the device was able to detect many of the issues that the human auditor detected, but revealed some shortcomings for specific vulnerabilities, mainly Structured Query Language (SQL) injections. The image of the OS with the pre-configured tools, automation scripts and programs is available for download from [Ber16b] and comprises one of the main outputs of this work.
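
    A hedged sketch of the kind of orchestration such a device automates (not the dissertation's code): drive an nmap service and OS scan from Python and emit a small structured report. It assumes nmap is installed and on the PATH; OS detection with -O generally requires root privileges.

    # Hedged sketch: run one nmap scan and convert its XML output into a
    # minimal JSON report, the sort of step the forensic box chains together.
    import json
    import subprocess
    import xml.etree.ElementTree as ET

    def quick_scan(target: str) -> list:
        xml_out = subprocess.run(
            ["nmap", "-sV", "-O", "-oX", "-", target],   # service/OS scan, XML to stdout
            capture_output=True, text=True, check=True,
        ).stdout
        report = []
        for host in ET.fromstring(xml_out).iter("host"):
            addr = host.find("address").get("addr")
            ports = []
            for p in host.iter("port"):
                service = p.find("service")
                ports.append({"port": p.get("portid"),
                              "service": service.get("name") if service is not None else None})
            report.append({"host": addr, "open_ports": ports})
        return report

    if __name__ == "__main__":
        print(json.dumps(quick_scan("192.168.1.0/24"), indent=2))   # example target range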

    Design of Theoretical Framework: Global and Local Parameters Requirements for Libraries

    The library is an important part of the modern reading environment, and a theoretical framework is indispensable for any library adopting automated and digital library systems. In this research paper, all parameters were selected on the basis of global recommendations and local requirements for libraries, organised into six theoretical sections: (i) a theoretical framework for the integrated library system (ILS) cluster; (ii) a theoretical framework for community communication and interaction; (iii) a theoretical framework for the digital media archiving cluster; (iv) a theoretical framework for the content management system; (v) a theoretical framework for the learning content management system; and (vi) a theoretical framework for the federated search system. For the integrated library system cluster, two things are most important: the development of the ILS and open source ILS software. This section also covers parameter selection, which can be developed in three ways: basic parameter settings, a theoretical framework for housekeeping operations, and a theoretical framework for the information retrieval system. Software selection and parameter selection are also pivotal tasks in the theoretical framework for community communication and interaction. The theoretical framework for the digital media archiving cluster can be developed in three sections: selection of software, selection of standards, and metadata selection for all libraries. The content management system framework can be developed in three ways: the workflow of the content management system, software selection in the CMS cluster, and parameter selection in the CMS cluster. The theoretical framework for the learning content management system for libraries is developed in three sections: components of the learning content management system, software selection in the LCMS cluster, and parameter selection in the LCMS cluster. Software selection and parameter selection are likewise important components of the federated search system theoretical framework for the development of a single-window interface.

    The Implementation and Audit of Internal Controls Regarding the Use of XBRL for Financial Statements

    XBRL is a global standard developed to aid more accurate and efficient business reporting by capitalizing on interactive data. All business processes carry benefits and risks, including XBRL-based financial reporting. Management and internal auditors must determine the associated risks and implement internal controls to mitigate them. This paper explores the risks and controls associated with XBRL. The research was completed through a literature review of professional and academic articles, supplemented with information gained from accounting professionals in both the internal and external audit fields. The Capstone seeks to answer the question: to what degree are XBRL-based financial statements being reviewed by internal audit, compared with the amount of risk assessment this process deserves? This paper discusses the control options for process owners to implement and the tests that evaluate the effectiveness of such controls. The research ultimately found that internal audit and external audit do very little to ensure the financial accuracy of XBRL-based financial statements, while companies still retain full liability for this interactive data. The ultimate goal is to inform internal audit professionals of the importance of implementing and testing controls within XBRL-based financial reporting.
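
    As a hedged illustration of one control that can be automated, the Python sketch below reconciles a tagged fact in an XBRL instance document against the figure approved in the traditional financial statements; the file name, taxonomy namespace and element name are assumptions, and a real control would also check contexts, units and decimals.

    # Hedged sketch: reconcile every occurrence of a tagged XBRL fact with the
    # approved figure. Namespace/element are illustrative, not a prescribed test.
    import xml.etree.ElementTree as ET

    US_GAAP = "{http://fasb.org/us-gaap/2023}"  # taxonomy year is an assumption

    def reconcile(instance_path: str, concept: str, approved_value: float) -> bool:
        tree = ET.parse(instance_path)
        facts = [float(el.text) for el in tree.iter(f"{US_GAAP}{concept}") if el.text]
        # Control passes only if at least one fact is tagged and all occurrences match.
        return bool(facts) and all(abs(f - approved_value) < 0.5 for f in facts)

    # Example (hypothetical filing):
    # assert reconcile("10k_instance.xml", "Revenues", 1_250_000_000.0)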

    Shortening Test Case Execution Time for Embedded Software


    Snazer: the simulations and networks analyzer

    Background: Networks are widely recognized as key determinants of structure and function in systems that span the biological, physical, and social sciences. They are static pictures of the interactions among the components of complex systems. Often, much effort is required to identify networks as part of particular patterns as well as to visualize and interpret them. From a pure dynamical perspective, simulation represents a relevant way out. Many simulator tools capitalized on the "noisy" behavior of some systems and used formal models to represent cellular activities as temporal trajectories. Statistical methods have been applied to a fairly large number of replicated trajectories in order to infer knowledge. A tool which both graphically manipulates reactive models and deals with sets of simulation time-course data by aggregation, interpretation and statistical analysis is missing and could add value to simulators. Results: We designed and implemented Snazer, the simulations and networks analyzer. Its goal is to aid the processes of visualizing and manipulating reactive models, as well as to share and interpret time-course data produced by stochastic simulators or by any other means. Conclusions: Snazer is a solid prototype that integrates biological network and simulation time-course data analysis techniques.
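
    A minimal sketch of the kind of aggregation such a tool performs on replicated stochastic simulation output (Snazer itself is a graphical tool; this Python/NumPy fragment only illustrates the statistics): per-time-point mean and standard deviation over replicates.

    # Hedged sketch: summarize replicated time-course trajectories.
    import numpy as np

    def aggregate(trajectories: np.ndarray):
        """trajectories has shape (replicates, time_points)."""
        return trajectories.mean(axis=0), trajectories.std(axis=0, ddof=1)

    # Toy example: 100 noisy replicates of an exponential decay.
    rng = np.random.default_rng(0)
    t = np.linspace(0, 10, 50)
    runs = np.exp(-0.5 * t) + rng.normal(0, 0.05, size=(100, t.size))
    mean, std = aggregate(runs)
    print(mean[:3], std[:3])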